List of contributors to RhodeCode project:
    Marcin Kuźmiński <marcin@python-works.com>
    Lukasz Balcerzak <lukaszbalcerzak@gmail.com>
    Jason Harris <jason@jasonfharris.com>
    Thayne Harbaugh <thayne@fusionio.com>
    cejones
    Thomas Waldmann <tw-public@gmx.de>
    Lorenzo M. Catucci <lorenzo@sancho.ccd.uniroma2.it>
    Dmitri Kuznetsov
    Jared Bunting <jared.bunting@peachjean.com>
    Steve Romanow <slestak989@gmail.com>
    Augosto Hermann <augusto.herrmann@planejamento.gov.br>
    Ankit Solanki <ankit.solanki@gmail.com>
    Liad Shani <liadff@gmail.com>
    Les Peabody <lpeabody@gmail.com>
    Jonas Oberschweiber <jonas.oberschweiber@d-velop.de>
    Matt Zuba <matt.zuba@goodwillaz.org>
    Aras Pranckevicius <aras@unity3d.com>
    Tony Bussieres <t.bussieres@gmail.com>
    Erwin Kroon <e.kroon@smartmetersolutions.nl>
.. _setup:

Setup
=====


Setting up RhodeCode
--------------------

First, you will need to create a RhodeCode configuration file. Run the
following command to do this::

    paster make-config RhodeCode production.ini

- This will create the file `production.ini` in the current directory. This
  configuration file contains the various settings for RhodeCode, e.g. proxy
  port, email settings, usage of static files, cache, celery settings and
  logging.

Next, you need to create the databases used by RhodeCode. I recommend that you
use sqlite (default) or postgresql. If you choose a database other than the
default, ensure you properly adjust the db url in your production.ini
configuration file to use this other database. Create the databases by running
the following command::

    paster setup-app production.ini

This will prompt you for a "root" path. This "root" path is the location where
RhodeCode will store all of its repositories on the current machine. After
entering this "root" path ``setup-app`` will also prompt you for a username
and password for the initial admin account which ``setup-app`` sets up for you.

- The ``setup-app`` command will create all of the needed tables and an admin
  account. When choosing a root path you can either use a new empty location,
  or a location which already contains existing repositories. If you choose a
  location which contains existing repositories, RhodeCode will simply add all
  of the repositories at the chosen location to its database. (Note: make
  sure you specify the correct path to the root.)
- Note: the given path for mercurial_ repositories **must** be write accessible
  for the application. This is very important: the RhodeCode web interface
  will work without write access, but any attempt to push will
  eventually fail with permission denied errors unless it has write access.
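
For example, assuming the application runs as the ``www-data`` user and the
root path is ``/home/hg`` (both are assumptions; adjust them to your own
deployment), write access could be granted like this::

```shell
# Give the application user ownership of the repository root
# and make sure it can write everywhere below it.
# "www-data" and "/home/hg" are placeholders for your setup.
chown -R www-data:www-data /home/hg
chmod -R u+rwX /home/hg
```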

You are now ready to use RhodeCode. To run it, simply execute::

    paster serve production.ini

- This command runs the RhodeCode server. The web app should be available at
  127.0.0.1:5000. This IP and port are configurable via the production.ini
  file created in the previous step.
- Use the admin account you created above when running ``setup-app`` to log in
  to the web app.
- The default permissions on each repository are read, and the owner is admin.
  Remember to update these if needed.
- In the admin panel you can toggle LDAP, anonymous, and permissions settings,
  as well as edit more advanced options on users and repositories.
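
As a sketch, the IP and port live in the ``[server:main]`` section of the
generated Paste configuration; the surrounding contents of your
`production.ini` may differ::

```ini
[server:main]
# The interface and port the web app binds to.
# Change these and restart "paster serve" to apply.
host = 127.0.0.1
port = 5000
```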

Try copying your own mercurial repository into the "root" directory you are
using, then from within the RhodeCode web application choose Admin >
repositories. Then choose Add New Repository. Add the repository you copied
into the root. Test that you can browse your repository from within RhodeCode
and then try cloning your repository from RhodeCode with::

    hg clone http://127.0.0.1:5000/<repository name>

where *repository name* is replaced by the name of your repository.

Using RhodeCode with SSH
------------------------

RhodeCode currently only hosts repositories using http and https. (The addition
of ssh hosting is a planned future feature.) However you can easily use ssh in
parallel with RhodeCode. (Repository access via ssh is a standard "out of
the box" feature of mercurial_ and you can use this to access any of the
repositories that RhodeCode is hosting. See PublishingRepositories_.)

RhodeCode repository structures are kept in directories with the same name
as the project. When using repository groups, each group is a subdirectory.
This allows you to easily use ssh for accessing repositories.

In order to use ssh you need to make sure that your web-server and the users'
login accounts have the correct permissions set on the appropriate directories.
(Note that these permissions are independent of any permissions you have set up
using the RhodeCode web interface.)

If your main directory (the same as set in RhodeCode settings) is for example
set to **/home/hg** and the repository you are using is named `rhodecode`, then
to clone via ssh you should run::

    hg clone ssh://user@server.com/home/hg/rhodecode

Using other external tools such as mercurial-server_ or using ssh key based
authentication is fully supported.

Note: In an advanced setup, in order for your ssh access to use the same
permissions as set up via the RhodeCode web interface, you can create an
authentication hook that connects to the rhodecode db and runs check functions
for permissions against it.

Setting up Whoosh full text search
----------------------------------

Starting from version 1.1 the whoosh index can be built by using the paster
command ``make-index``. To use ``make-index`` you must specify the configuration
file that stores the location of the index. You may specify the location of the
repositories (`--repo-location`). If not specified, this value is retrieved
from the RhodeCode database. This was required prior to 1.2. Starting from
version 1.2 it is also possible to specify a comma separated list of
repositories (`--index-only`) to build the index only for the chosen
repositories, skipping any others found in the repositories location.

You may optionally pass the option `-f` to enable a full index rebuild. Without
the `-f` option, indexing will always run in "incremental" mode.

For an incremental index build use::

    paster make-index production.ini

For a full index rebuild use::

    paster make-index production.ini -f


Building the index just for chosen repositories is possible with the following
command::

    paster make-index production.ini --index-only=vcs,rhodecode


In order to do periodic index builds and keep your index always up to date,
it's recommended to add a crontab entry for incremental indexing.
An example entry might look like this::

    /path/to/python/bin/paster make-index /path/to/rhodecode/production.ini
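
In a crontab, five schedule fields precede the command. For instance, to run
the incremental build nightly at 02:00 (reusing the placeholder paths from the
command above)::

```shell
# m h dom mon dow  command
0 2 * * * /path/to/python/bin/paster make-index /path/to/rhodecode/production.ini
```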

When using incremental mode (the default) whoosh will check the last
modification date of each file and add it to be reindexed if a newer file is
available. The indexing daemon also checks for any removed files and removes
them from the index.

If you want to rebuild the index from scratch, you can use the `-f` flag as
above, or in the admin panel you can check the `build from scratch` flag.


Setting up LDAP support
-----------------------

RhodeCode starting from version 1.1 supports ldap authentication. In order
to use LDAP, you have to install the python-ldap_ package. This package is
available via pypi, so you can install it by running:

using easy_install::

    easy_install python-ldap

using pip::

    pip install python-ldap

.. note::
   python-ldap requires certain libraries on your system, so before installing
   it check that you have at least the `openldap` and `sasl` libraries.

LDAP settings are located in the admin->ldap section.

Here's a typical ldap setup::

  Connection settings
    Enable LDAP          = checked
    Host                 = host.example.org
    Port                 = 389
    Account              = <account>
    Password             = <password>
    Connection Security  = LDAPS connection
    Certificate Checks   = DEMAND

  Search settings
    Base DN              = CN=users,DC=host,DC=example,DC=org
    LDAP Filter          = (&(objectClass=user)(!(objectClass=computer)))
    LDAP Search Scope    = SUBTREE

  Attribute mappings
    Login Attribute      = uid
    First Name Attribute = firstName
    Last Name Attribute  = lastName
    E-mail Attribute     = mail

.. _enable_ldap:

Enable LDAP : required
    Whether to use LDAP for authenticating users.

.. _ldap_host:

Host : required
    LDAP server hostname or IP address.

.. _Port:

Port : required
    389 for un-encrypted LDAP, 636 for SSL-encrypted LDAP.

.. _ldap_account:

Account : optional
    Only required if the LDAP server does not allow anonymous browsing of
    records. This should be a special account for record browsing. This
    will require `LDAP Password`_ below.

.. _LDAP Password:

Password : optional
    Only required if the LDAP server does not allow anonymous browsing of
    records.

.. _Enable LDAPS:

Connection Security : required
    Defines the connection to the LDAP server.

    No encryption
        Plain non-encrypted connection.

    LDAPS connection
        Enable ldaps connection. It will likely require `Port`_ to be set to
        a different value (standard LDAPS port is 636). When LDAPS is enabled
        then `Certificate Checks`_ is required.

    START_TLS on LDAP connection
        START TLS connection.

.. _Certificate Checks:

Certificate Checks : optional
    How SSL certificate verification is handled - this is only useful when
    `Enable LDAPS`_ is enabled. Only DEMAND or HARD offer full SSL security
    while the other options are susceptible to man-in-the-middle attacks. SSL
    certificates can be installed to /etc/openldap/cacerts so that the
    DEMAND or HARD options can be used with self-signed certificates or
    certificates that do not have a traceable certificate authority.

    NEVER
        A server certificate will never be requested or checked.

    ALLOW
        A server certificate is requested. Failure to provide a
        certificate or providing a bad certificate will not terminate the
        session.

    TRY
        A server certificate is requested. Failure to provide a
        certificate does not halt the session; providing a bad certificate
        halts the session.

    DEMAND
        A server certificate is requested and must be provided and
        authenticated for the session to proceed.

    HARD
        The same as DEMAND.

.. _Base DN:

Base DN : required
    The Distinguished Name (DN) where searches for users will be performed.
    Searches can be controlled by `LDAP Filter`_ and `LDAP Search Scope`_.

.. _LDAP Filter:

LDAP Filter : optional
    An LDAP filter defined by RFC 2254. This is more useful when `LDAP
    Search Scope`_ is set to SUBTREE. The filter is useful for limiting
    which LDAP objects are identified as representing Users for
    authentication. The filter is augmented by `Login Attribute`_ below.
    This can commonly be left blank.

.. _LDAP Search Scope:

LDAP Search Scope : required
    This limits how far LDAP will search for a matching object.

    BASE
        Only allows searching of `Base DN`_ and is usually not what you
        want.

    ONELEVEL
        Searches all entries under `Base DN`_, but not Base DN itself.

    SUBTREE
        Searches all entries below `Base DN`_, but not Base DN itself.
        When using SUBTREE `LDAP Filter`_ is useful to limit object
        location.

.. _Login Attribute:

Login Attribute : required
    The LDAP record attribute that will be matched as the USERNAME or
    ACCOUNT used to connect to RhodeCode. This will be added to `LDAP
    Filter`_ for locating the User object. If `LDAP Filter`_ is specified as
    "LDAPFILTER", `Login Attribute`_ is specified as "uid" and the user has
    connected as "jsmith" then the `LDAP Filter`_ will be augmented as
    below::

        (&(LDAPFILTER)(uid=jsmith))
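
This augmentation can be sketched as a simple string composition (illustrative
only; RhodeCode performs the equivalent internally, and the username is not
RFC 2254-escaped in this sketch)::

```python
def augment_ldap_filter(ldap_filter, login_attr, username):
    """Combine the configured LDAP filter with the login attribute match
    using the RFC 2254 '&' syntax, e.g. (&(LDAPFILTER)(uid=jsmith)).

    NOTE: `username` is not RFC 2254-escaped here; real code must escape it.
    """
    if not ldap_filter:
        # With a blank filter, only the login attribute is matched.
        return "(%s=%s)" % (login_attr, username)
    return "(&%s(%s=%s))" % (ldap_filter, login_attr, username)
```

For example, ``augment_ldap_filter("(LDAPFILTER)", "uid", "jsmith")`` yields
the augmented filter shown above.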

.. _ldap_attr_firstname:

First Name Attribute : required
    The LDAP record attribute which represents the user's first name.

.. _ldap_attr_lastname:

Last Name Attribute : required
    The LDAP record attribute which represents the user's last name.

.. _ldap_attr_email:

Email Attribute : required
    The LDAP record attribute which represents the user's email address.

If all data are entered correctly, and python-ldap_ is properly installed,
users should be granted access to RhodeCode with ldap accounts. At this
time user information is copied from LDAP into the RhodeCode user database.
This means that updates of an LDAP user object may not be reflected as a
user update in RhodeCode.

If you have problems with LDAP access and believe you entered the correct
information, check the RhodeCode logs; any error messages sent from LDAP
will be saved there.


Active Directory
''''''''''''''''

RhodeCode can use Microsoft Active Directory for user authentication. This
is done through an LDAP or LDAPS connection to Active Directory. The
following LDAP configuration settings are typical for using Active
Directory::

  Base DN              = OU=SBSUsers,OU=Users,OU=MyBusiness,DC=v3sys,DC=local
  Login Attribute      = sAMAccountName
  First Name Attribute = givenName
  Last Name Attribute  = sn
  E-mail Attribute     = mail

All other LDAP settings will likely be site-specific and should be
appropriately configured.


Authentication by container or reverse-proxy
--------------------------------------------

Starting with version 1.3, RhodeCode supports delegating the authentication
of users to its WSGI container, or to a reverse-proxy server through which all
clients access the application.

When these authentication methods are enabled in RhodeCode, it uses the
username that the container/proxy (Apache/Nginx/etc) authenticated and doesn't
perform the authentication itself. The authorization, however, is still done by
RhodeCode according to its settings.

When a user logs in for the first time using these authentication methods,
a matching user account is created in RhodeCode with default permissions. An
administrator can then modify it using RhodeCode's admin interface.
It's also possible for an administrator to create accounts and configure their
365 | permissions before the user logs in for the first time. |
|
365 | permissions before the user logs in for the first time. | |
366 |
|
366 | |||
367 | Container-based authentication |
|
367 | Container-based authentication | |
368 | '''''''''''''''''''''''''''''' |
|
368 | '''''''''''''''''''''''''''''' | |
369 |
|
369 | |||
370 | In a container-based authentication setup, RhodeCode reads the user name from |
|
370 | In a container-based authentication setup, RhodeCode reads the user name from | |
371 | the ``REMOTE_USER`` server variable provided by the WSGI container. |
|
371 | the ``REMOTE_USER`` server variable provided by the WSGI container. | |
372 |
|
372 | |||
373 | After setting up your container (see `Apache's WSGI config`_), you'd need |
|
373 | After setting up your container (see `Apache's WSGI config`_), you'd need | |
374 | to configure it to require authentication on the location configured for |
|
374 | to configure it to require authentication on the location configured for | |
375 | RhodeCode. |
|
375 | RhodeCode. | |
376 |
|
376 | |||
377 | In order for RhodeCode to start using the provided username, you should set the |
|
377 | In order for RhodeCode to start using the provided username, you should set the | |
378 | following in the [app:main] section of your .ini file:: |
|
378 | following in the [app:main] section of your .ini file:: | |
379 |
|
379 | |||
380 | container_auth_enabled = true |
|
380 | container_auth_enabled = true | |
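To see where the username comes from in this setup, here is a minimal WSGI sketch (an illustration, not RhodeCode code): the container authenticates the request and exposes the result in the ``REMOTE_USER`` variable, which the application simply trusts.

```python
def application(environ, start_response):
    """Minimal WSGI app that trusts the container-authenticated user.
    The container (e.g. Apache with mod_wsgi) sets REMOTE_USER after
    performing authentication; the app never sees the credentials."""
    user = environ.get('REMOTE_USER', 'anonymous')
    body = ('logged in as: %s' % user).encode('utf-8')
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [body]
```

This is also why the server must require authentication on RhodeCode's location: with no authentication configured, ``REMOTE_USER`` is simply absent.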
381 |
|
381 | |||
382 |
|
382 | |||
383 | Proxy pass-through authentication |
|
383 | Proxy pass-through authentication | |
384 | ''''''''''''''''''''''''''''''''' |
|
384 | ''''''''''''''''''''''''''''''''' | |
385 |
|
385 | |||
386 | In a proxy pass-through authentication setup, RhodeCode reads the user name |
|
386 | In a proxy pass-through authentication setup, RhodeCode reads the user name | |
387 | from the ``X-Forwarded-User`` request header, which should be configured to be |
|
387 | from the ``X-Forwarded-User`` request header, which should be configured to be | |
388 | sent by the reverse-proxy server. |
|
388 | sent by the reverse-proxy server. | |
389 |
|
389 | |||
390 | After setting up your proxy solution (see `Apache virtual host reverse proxy example`_, |
|
390 | After setting up your proxy solution (see `Apache virtual host reverse proxy example`_, | |
391 | `Apache as subdirectory`_ or `Nginx virtual host example`_), you'd need to |
|
391 | `Apache as subdirectory`_ or `Nginx virtual host example`_), you'd need to | |
392 | configure the authentication and add the username in a request header named |
|
392 | configure the authentication and add the username in a request header named | |
393 | ``X-Forwarded-User``. |
|
393 | ``X-Forwarded-User``. | |
394 |
|
394 | |||
395 | For example, the following config section for Apache sets a subdirectory in a |
|
395 | For example, the following config section for Apache sets a subdirectory in a | |
396 | reverse-proxy setup with basic auth:: |
|
396 | reverse-proxy setup with basic auth:: | |
397 |
|
397 | |||
398 | <Location /<someprefix> > |
|
398 | <Location /<someprefix> > | |
399 | ProxyPass http://127.0.0.1:5000/<someprefix> |
|
399 | ProxyPass http://127.0.0.1:5000/<someprefix> | |
400 | ProxyPassReverse http://127.0.0.1:5000/<someprefix> |
|
400 | ProxyPassReverse http://127.0.0.1:5000/<someprefix> | |
401 | SetEnvIf X-Url-Scheme https HTTPS=1 |
|
401 | SetEnvIf X-Url-Scheme https HTTPS=1 | |
402 |
|
402 | |||
403 | AuthType Basic |
|
403 | AuthType Basic | |
404 | AuthName "RhodeCode authentication" |
|
404 | AuthName "RhodeCode authentication" | |
405 | AuthUserFile /home/web/rhodecode/.htpasswd |
|
405 | AuthUserFile /home/web/rhodecode/.htpasswd | |
406 | require valid-user |
|
406 | require valid-user | |
407 |
|
407 | |||
408 | RequestHeader unset X-Forwarded-User |
|
408 | RequestHeader unset X-Forwarded-User | |
409 |
|
409 | |||
410 | RewriteEngine On |
|
410 | RewriteEngine On | |
411 | RewriteCond %{LA-U:REMOTE_USER} (.+) |
|
411 | RewriteCond %{LA-U:REMOTE_USER} (.+) | |
412 | RewriteRule .* - [E=RU:%1] |
|
412 | RewriteRule .* - [E=RU:%1] | |
413 | RequestHeader set X-Forwarded-User %{RU}e |
|
413 | RequestHeader set X-Forwarded-User %{RU}e | |
414 | </Location> |
|
414 | </Location> | |
415 |
|
415 | |||
416 | In order for RhodeCode to start using the forwarded username, you should set |
|
416 | In order for RhodeCode to start using the forwarded username, you should set | |
417 | the following in the [app:main] section of your .ini file:: |
|
417 | the following in the [app:main] section of your .ini file:: | |
418 |
|
418 | |||
419 | proxypass_auth_enabled = true |
|
419 | proxypass_auth_enabled = true | |
420 |
|
420 | |||
421 | .. note:: |
|
421 | .. note:: | |
422 | If you enable proxy pass-through authentication, make sure your server is |
|
422 | If you enable proxy pass-through authentication, make sure your server is | |
423 | only accessible through the proxy. Otherwise, any client would be able to |
|
423 | only accessible through the proxy. Otherwise, any client would be able to | |
424 | forge the authentication header and could effectively become authenticated |
|
424 | forge the authentication header and could effectively become authenticated | |
425 | using any account of their liking. |
|
425 | using any account of their liking. | |
426 |
|
426 | |||
427 | Integration with Issue trackers |
|
427 | Integration with Issue trackers | |
428 | ------------------------------- |
|
428 | ------------------------------- | |
429 |
|
429 | |||
430 | RhodeCode provides a simple integration with issue trackers. It's possible |

430 | RhodeCode provides a simple integration with issue trackers. It's possible | |
431 | to define a regular expression that will find issue ids stored in commit |

431 | to define a regular expression that will find issue ids stored in commit | |
432 | messages and replace them with URLs to those issues. To enable this, simply |

432 | messages and replace them with URLs to those issues. To enable this, simply | |
433 | uncomment the following variables in the ini file:: |

433 | uncomment the following variables in the ini file:: | |
434 |
|
434 | |||
435 | url_pat = (?:^#|\s#)(\w+) |
|
435 | url_pat = (?:^#|\s#)(\w+) | |
436 | issue_server_link = https://myissueserver.com/{repo}/issue/{id} |
|
436 | issue_server_link = https://myissueserver.com/{repo}/issue/{id} | |
437 | issue_prefix = # |
|
437 | issue_prefix = # | |
438 |
|
438 | |||
439 | `url_pat` is the regular expression that will match issues in commit messages. |

439 | `url_pat` is the regular expression that will match issues in commit messages. | |
440 | The default regex matches issues in the format #<number>, e.g. #300. |

440 | The default regex matches issues in the format #<number>, e.g. #300. | |
441 |
|
441 | |||
442 | Matched issues will be replaced with the link specified as `issue_server_link`; |

442 | Matched issues will be replaced with the link specified as `issue_server_link`; | |
443 | {id} will be replaced with the issue id, and {repo} with the repository name. |

443 | {id} will be replaced with the issue id, and {repo} with the repository name. | |
444 | Since the # is stripped, `issue_prefix` is added as a prefix to the url. |

444 | Since the # is stripped, `issue_prefix` is added as a prefix to the url. | |
445 | `issue_prefix` can be something other than #; if you pass |

445 | `issue_prefix` can be something other than #; if you pass | |
446 | ISSUE- as the issue prefix, this will generate a url in the format:: |

446 | ISSUE- as the issue prefix, this will generate a url in the format:: | |
447 |
|
447 | |||
448 | <a href="https://myissueserver.com/example_repo/issue/300">ISSUE-300</a> |
|
448 | <a href="https://myissueserver.com/example_repo/issue/300">ISSUE-300</a> | |
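A rough sketch of this substitution in Python (the helper below is illustrative; RhodeCode's actual implementation may differ):

```python
import re

# the three ini settings from above, with the default prefix
url_pat = r'(?:^#|\s#)(\w+)'
issue_server_link = 'https://myissueserver.com/{repo}/issue/{id}'
issue_prefix = '#'

def link_issues(message, repo):
    """Replace each matched issue id in a commit message with a link
    built from issue_server_link; issue_prefix becomes the visible
    prefix of the link text, since the regex strips the #."""
    def _repl(m):
        iid = m.group(1)
        url = issue_server_link.format(repo=repo, id=iid)
        link = '<a href="%s">%s%s</a>' % (url, issue_prefix, iid)
        return m.group(0).replace('#' + iid, link)
    return re.sub(url_pat, _repl, message)

print(link_issues('fixes #300', 'example_repo'))
# fixes <a href="https://myissueserver.com/example_repo/issue/300">#300</a>
```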
449 |
|
449 | |||
450 | Hook management |
|
450 | Hook management | |
451 | --------------- |
|
451 | --------------- | |
452 |
|
452 | |||
453 | Hooks can be managed in a similar way to that used in .hgrc files. |

453 | Hooks can be managed in a similar way to that used in .hgrc files. | |
454 | To access hook settings, click `advanced setup` in the Hooks section of Mercurial |

454 | To access hook settings, click `advanced setup` in the Hooks section of Mercurial | |
455 | Settings in Admin. |

455 | Settings in Admin. | |
456 |
|
456 | |||
457 | There are 4 built-in hooks that cannot be changed (only enabled/disabled by |

457 | There are 4 built-in hooks that cannot be changed (only enabled/disabled by | |
458 | checkboxes in the previous section). |

458 | checkboxes in the previous section). | |
459 | To add another custom hook, simply fill in the first field with |

459 | To add another custom hook, simply fill in the first field with | |
460 | <name>.<hook_type> and the second one with the hook path. Example hooks |

460 | <name>.<hook_type> and the second one with the hook path. Example hooks | |
461 | can be found at *rhodecode.lib.hooks*. |
|
461 | can be found at *rhodecode.lib.hooks*. | |
462 |
|
462 | |||
463 |
|
463 | |||
464 | Changing default encoding |
|
464 | Changing default encoding | |
465 | ------------------------- |
|
465 | ------------------------- | |
466 |
|
466 | |||
467 | By default RhodeCode uses utf8 encoding. Starting from the 1.3 series this |

467 | By default RhodeCode uses utf8 encoding. Starting from the 1.3 series this | |
468 | can be changed; simply edit default_encoding in the .ini file to the desired one. |

468 | can be changed; simply edit default_encoding in the .ini file to the desired one. | |
469 | This affects many parts of rhodecode, including committer names, filenames and |

469 | This affects many parts of rhodecode, including committer names, filenames and | |
470 | the encoding of commit messages. In addition, RhodeCode can detect if the `chardet` |

470 | the encoding of commit messages. In addition, RhodeCode can detect if the `chardet` | |
471 | library is installed. If `chardet` is detected, RhodeCode will fall back to it |

471 | library is installed. If `chardet` is detected, RhodeCode will fall back to it | |
472 | when there are encode/decode errors. |

472 | when there are encode/decode errors. | |
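The fallback behaviour described above can be sketched roughly as follows. This is a hypothetical helper, not RhodeCode's actual function; in particular the final lossy latin-1 decode is an assumption of this sketch:

```python
def safe_unicode(data, default_encoding='utf8'):
    """Decode bytes with the configured default encoding; on failure,
    try chardet detection if the library is installed, otherwise use a
    lossy latin-1 decode as a last resort."""
    try:
        return data.decode(default_encoding)
    except UnicodeDecodeError:
        try:
            import chardet
            guessed = chardet.detect(data).get('encoding') or 'latin-1'
        except ImportError:
            guessed = 'latin-1'
        return data.decode(guessed, 'replace')

print(safe_unicode(b'commit by \xc5\x81ukasz'))  # utf-8 bytes decode cleanly
```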
473 |
|
473 | |||
474 |
|
474 | |||
475 | Setting Up Celery |
|
475 | Setting Up Celery | |
476 | ----------------- |
|
476 | ----------------- | |
477 |
|
477 | |||
478 | Since version 1.1 celery is configured by the rhodecode ini configuration files. |
|
478 | Since version 1.1 celery is configured by the rhodecode ini configuration files. | |
479 | Simply set use_celery=true in the ini file, then add or change the configuration |

479 | Simply set use_celery=true in the ini file, then add or change the configuration | |
480 | variables inside the ini file. |
|
480 | variables inside the ini file. | |
481 |
|
481 | |||
482 | Remember that the ini files use '.' in setting names where celery uses '_'. |

482 | Remember that the ini files use '.' in setting names where celery uses '_'. | |
483 | So, for example, setting `BROKER_HOST` in celery means setting `broker.host` in |

483 | So, for example, setting `BROKER_HOST` in celery means setting `broker.host` in | |
484 | the config file. |

484 | the config file. | |
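The name mapping above can be expressed as a one-liner (illustrative only; verify each resulting name against the sample ini file):

```python
def celery_setting_to_ini(name):
    """Map a celery setting name like BROKER_HOST to the dotted,
    lower-case form used in the rhodecode ini file (broker.host)."""
    return name.lower().replace('_', '.')

print(celery_setting_to_ini('BROKER_HOST'))  # broker.host
```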
485 |
|
485 | |||
486 | In order to start using celery run:: |
|
486 | In order to start using celery run:: | |
487 |
|
487 | |||
488 | paster celeryd <configfile.ini> |
|
488 | paster celeryd <configfile.ini> | |
489 |
|
489 | |||
490 |
|
490 | |||
491 | .. note:: |
|
491 | .. note:: | |
492 | Make sure you run this command from the same virtualenv, and with the same |
|
492 | Make sure you run this command from the same virtualenv, and with the same | |
493 | user that rhodecode runs as. |

493 | user that rhodecode runs as. | |
494 |
|
494 | |||
495 | HTTPS support |
|
495 | HTTPS support | |
496 | ------------- |
|
496 | ------------- | |
497 |
|
497 | |||
498 | There are two ways to enable https: |
|
498 | There are two ways to enable https: | |
499 |
|
499 | |||
500 | - Set HTTP_X_URL_SCHEME in your http server headers; RhodeCode will |

500 | - Set HTTP_X_URL_SCHEME in your http server headers; RhodeCode will | |
501 | recognize this header and make proper https redirections |

501 | recognize this header and make proper https redirections | |
502 | - Alternatively, set the `force_https = true` flag in the ini configuration |

502 | - Alternatively, set the `force_https = true` flag in the ini configuration | |
503 | to force using https; no headers are needed then to enable https |

503 | to force using https; no headers are needed then to enable https | |
504 |
|
504 | |||
505 |
|
505 | |||
506 | Nginx virtual host example |
|
506 | Nginx virtual host example | |
507 | -------------------------- |
|
507 | -------------------------- | |
508 |
|
508 | |||
509 | Sample config for nginx using proxy:: |
|
509 | Sample config for nginx using proxy:: | |
510 |
|
510 | |||
511 | upstream rc { |
|
511 | upstream rc { | |
512 | server 127.0.0.1:5000; |
|
512 | server 127.0.0.1:5000; | |
513 | # add more instances for load balancing |
|
513 | # add more instances for load balancing | |
514 | #server 127.0.0.1:5001; |
|
514 | #server 127.0.0.1:5001; | |
515 | #server 127.0.0.1:5002; |
|
515 | #server 127.0.0.1:5002; | |
516 | } |
|
516 | } | |
517 |
|
517 | |||
518 | server { |
|
518 | server { | |
519 | listen 80; |
|
519 | listen 80; | |
520 | server_name hg.myserver.com; |
|
520 | server_name hg.myserver.com; | |
521 | access_log /var/log/nginx/rhodecode.access.log; |
|
521 | access_log /var/log/nginx/rhodecode.access.log; | |
522 | error_log /var/log/nginx/rhodecode.error.log; |
|
522 | error_log /var/log/nginx/rhodecode.error.log; | |
523 |
|
523 | |||
524 | location / { |
|
524 | location / { | |
525 | try_files $uri @rhode; |
|
525 | try_files $uri @rhode; | |
526 | } |
|
526 | } | |
527 |
|
527 | |||
528 | location @rhode { |
|
528 | location @rhode { | |
529 | proxy_pass http://rc; |
|
529 | proxy_pass http://rc; | |
530 | include /etc/nginx/proxy.conf; |
|
530 | include /etc/nginx/proxy.conf; | |
531 | } |
|
531 | } | |
532 |
|
532 | |||
533 | } |
|
533 | } | |
534 |
|
534 | |||
535 | Here's the proxy.conf. It's tuned so it will not time out on long-running |

535 | Here's the proxy.conf. It's tuned so it will not time out on long-running | |
536 | or large pushes:: |

536 | or large pushes:: | |
537 |
|
537 | |||
538 | proxy_redirect off; |
|
538 | proxy_redirect off; | |
539 | proxy_set_header Host $host; |
|
539 | proxy_set_header Host $host; | |
540 | proxy_set_header X-Url-Scheme $scheme; |
|
540 | proxy_set_header X-Url-Scheme $scheme; | |
541 | proxy_set_header X-Host $http_host; |
|
541 | proxy_set_header X-Host $http_host; | |
542 | proxy_set_header X-Real-IP $remote_addr; |
|
542 | proxy_set_header X-Real-IP $remote_addr; | |
543 | proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; |
|
543 | proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; | |
544 | proxy_set_header Proxy-host $proxy_host; |
|
544 | proxy_set_header Proxy-host $proxy_host; | |
545 | client_max_body_size 400m; |
|
545 | client_max_body_size 400m; | |
546 | client_body_buffer_size 128k; |
|
546 | client_body_buffer_size 128k; | |
547 | proxy_buffering off; |
|
547 | proxy_buffering off; | |
548 | proxy_connect_timeout 7200; |
|
548 | proxy_connect_timeout 7200; | |
549 | proxy_send_timeout 7200; |
|
549 | proxy_send_timeout 7200; | |
550 | proxy_read_timeout 7200; |
|
550 | proxy_read_timeout 7200; | |
551 | proxy_buffers 8 32k; |
|
551 | proxy_buffers 8 32k; | |
552 |
|
552 | |||
553 | Also, when using the root path with nginx, you might set static files to false |

553 | Also, when using the root path with nginx, you might set static files to false | |
554 | in the production.ini file:: |

554 | in the production.ini file:: | |
555 |
|
555 | |||
556 | [app:main] |
|
556 | [app:main] | |
557 | use = egg:rhodecode |
|
557 | use = egg:rhodecode | |
558 | full_stack = true |
|
558 | full_stack = true | |
559 | static_files = false |
|
559 | static_files = false | |
560 | lang=en |
|
560 | lang=en | |
561 | cache_dir = %(here)s/data |
|
561 | cache_dir = %(here)s/data | |
562 |
|
562 | |||
563 | This prevents the statics from being served by the application, which improves speed. |

563 | This prevents the statics from being served by the application, which improves speed. | |
564 |
|
564 | |||
565 |
|
565 | |||
566 | Apache virtual host reverse proxy example |
|
566 | Apache virtual host reverse proxy example | |
567 | ----------------------------------------- |
|
567 | ----------------------------------------- | |
568 |
|
568 | |||
569 | Here is a sample configuration file for apache using proxy:: |
|
569 | Here is a sample configuration file for apache using proxy:: | |
570 |
|
570 | |||
571 | <VirtualHost *:80> |
|
571 | <VirtualHost *:80> | |
572 | ServerName hg.myserver.com |
|
572 | ServerName hg.myserver.com | |
573 | ServerAlias hg.myserver.com |
|
573 | ServerAlias hg.myserver.com | |
574 |
|
574 | |||
575 | <Proxy *> |
|
575 | <Proxy *> | |
576 | Order allow,deny |
|
576 | Order allow,deny | |
577 | Allow from all |
|
577 | Allow from all | |
578 | </Proxy> |
|
578 | </Proxy> | |
579 |
|
579 | |||
580 | #important ! |
|
580 | #important ! | |
581 | #Directive to properly generate url (clone url) for pylons |
|
581 | #Directive to properly generate url (clone url) for pylons | |
582 | ProxyPreserveHost On |
|
582 | ProxyPreserveHost On | |
583 |
|
583 | |||
584 | #rhodecode instance |
|
584 | #rhodecode instance | |
585 | ProxyPass / http://127.0.0.1:5000/ |
|
585 | ProxyPass / http://127.0.0.1:5000/ | |
586 | ProxyPassReverse / http://127.0.0.1:5000/ |
|
586 | ProxyPassReverse / http://127.0.0.1:5000/ | |
587 |
|
587 | |||
588 | #to enable https use line below |
|
588 | #to enable https use line below | |
589 | #SetEnvIf X-Url-Scheme https HTTPS=1 |
|
589 | #SetEnvIf X-Url-Scheme https HTTPS=1 | |
590 |
|
590 | |||
591 | </VirtualHost> |
|
591 | </VirtualHost> | |
592 |
|
592 | |||
593 |
|
593 | |||
594 | Additional tutorial |
|
594 | Additional tutorial | |
595 | http://wiki.pylonshq.com/display/pylonscookbook/Apache+as+a+reverse+proxy+for+Pylons |
|
595 | http://wiki.pylonshq.com/display/pylonscookbook/Apache+as+a+reverse+proxy+for+Pylons | |
596 |
|
596 | |||
597 |
|
597 | |||
598 | Apache as subdirectory |
|
598 | Apache as subdirectory | |
599 | ---------------------- |
|
599 | ---------------------- | |
600 |
|
600 | |||
601 | Apache subdirectory part:: |
|
601 | Apache subdirectory part:: | |
602 |
|
602 | |||
603 | <Location /<someprefix> > |
|
603 | <Location /<someprefix> > | |
604 | ProxyPass http://127.0.0.1:5000/<someprefix> |
|
604 | ProxyPass http://127.0.0.1:5000/<someprefix> | |
605 | ProxyPassReverse http://127.0.0.1:5000/<someprefix> |
|
605 | ProxyPassReverse http://127.0.0.1:5000/<someprefix> | |
606 | SetEnvIf X-Url-Scheme https HTTPS=1 |
|
606 | SetEnvIf X-Url-Scheme https HTTPS=1 | |
607 | </Location> |
|
607 | </Location> | |
608 |
|
608 | |||
609 | Besides the regular apache setup, you will need to add the following line |

609 | Besides the regular apache setup, you will need to add the following line | |
610 | into the [app:main] section of your .ini file:: |

610 | into the [app:main] section of your .ini file:: | |
611 |
|
611 | |||
612 | filter-with = proxy-prefix |
|
612 | filter-with = proxy-prefix | |
613 |
|
613 | |||
614 | Add the following at the end of the .ini file:: |
|
614 | Add the following at the end of the .ini file:: | |
615 |
|
615 | |||
616 | [filter:proxy-prefix] |
|
616 | [filter:proxy-prefix] | |
617 | use = egg:PasteDeploy#prefix |
|
617 | use = egg:PasteDeploy#prefix | |
618 | prefix = /<someprefix> |
|
618 | prefix = /<someprefix> | |
619 |
|
619 | |||
620 |
|
620 | |||
621 | Then change <someprefix> into your chosen prefix. |

621 | Then change <someprefix> into your chosen prefix. | |
622 |
|
622 | |||
623 | Apache's WSGI config |
|
623 | Apache's WSGI config | |
624 | -------------------- |
|
624 | -------------------- | |
625 |
|
625 | |||
626 | Alternatively, RhodeCode can be set up with Apache under mod_wsgi. For |
|
626 | Alternatively, RhodeCode can be set up with Apache under mod_wsgi. For | |
627 | that, you'll need to: |
|
627 | that, you'll need to: | |
628 |
|
628 | |||
629 | - Install mod_wsgi. If using a Debian-based distro, you can install |
|
629 | - Install mod_wsgi. If using a Debian-based distro, you can install | |
630 | the package libapache2-mod-wsgi:: |
|
630 | the package libapache2-mod-wsgi:: | |
631 |
|
631 | |||
632 | aptitude install libapache2-mod-wsgi |
|
632 | aptitude install libapache2-mod-wsgi | |
633 |
|
633 | |||
634 | - Enable mod_wsgi:: |
|
634 | - Enable mod_wsgi:: | |
635 |
|
635 | |||
636 | a2enmod wsgi |
|
636 | a2enmod wsgi | |
637 |
|
637 | |||
638 | - Create a wsgi dispatch script, like the one below. Make sure you |
|
638 | - Create a wsgi dispatch script, like the one below. Make sure you | |
639 | check that the paths correctly point to where you installed RhodeCode |

639 | check that the paths correctly point to where you installed RhodeCode | |
640 | and its Python Virtual Environment. |
|
640 | and its Python Virtual Environment. | |
641 | - Enable the WSGIScriptAlias directive for the wsgi dispatch script, |
|
641 | - Enable the WSGIScriptAlias directive for the wsgi dispatch script, | |
642 | as in the following example. Once again, check the paths are |
|
642 | as in the following example. Once again, check the paths are | |
643 | correctly specified. |
|
643 | correctly specified. | |
644 |
|
644 | |||
645 | Here is a sample excerpt from an Apache Virtual Host configuration file:: |
|
645 | Here is a sample excerpt from an Apache Virtual Host configuration file:: | |
646 |
|
646 | |||
647 | WSGIDaemonProcess pylons user=www-data group=www-data processes=1 \ |
|
647 | WSGIDaemonProcess pylons user=www-data group=www-data processes=1 \ | |
648 | threads=4 \ |
|
648 | threads=4 \ | |
649 | python-path=/home/web/rhodecode/pyenv/lib/python2.6/site-packages |
|
649 | python-path=/home/web/rhodecode/pyenv/lib/python2.6/site-packages | |
650 | WSGIScriptAlias / /home/web/rhodecode/dispatch.wsgi |
|
650 | WSGIScriptAlias / /home/web/rhodecode/dispatch.wsgi | |
|
651 | WSGIPassAuthorization On | |||
651 |
|
652 | |||
652 | Example wsgi dispatch script:: |
|
653 | Example wsgi dispatch script:: | |
653 |
|
654 | |||
654 | import os |
|
655 | import os | |
655 | os.environ["HGENCODING"] = "UTF-8" |
|
656 | os.environ["HGENCODING"] = "UTF-8" | |
656 | os.environ['PYTHON_EGG_CACHE'] = '/home/web/rhodecode/.egg-cache' |
|
657 | os.environ['PYTHON_EGG_CACHE'] = '/home/web/rhodecode/.egg-cache' | |
657 |
|
658 | |||
658 | # sometimes it's needed to set the current dir |

659 | # sometimes it's needed to set the current dir | |
659 | os.chdir('/home/web/rhodecode/') |
|
660 | os.chdir('/home/web/rhodecode/') | |
660 |
|
661 | |||
661 | import site |
|
662 | import site | |
662 | site.addsitedir("/home/web/rhodecode/pyenv/lib/python2.6/site-packages") |
|
663 | site.addsitedir("/home/web/rhodecode/pyenv/lib/python2.6/site-packages") | |
663 |
|
664 | |||
664 | from paste.deploy import loadapp |
|
665 | from paste.deploy import loadapp | |
665 | from paste.script.util.logging_config import fileConfig |
|
666 | from paste.script.util.logging_config import fileConfig | |
666 |
|
667 | |||
667 | fileConfig('/home/web/rhodecode/production.ini') |
|
668 | fileConfig('/home/web/rhodecode/production.ini') | |
668 | application = loadapp('config:/home/web/rhodecode/production.ini') |
|
669 | application = loadapp('config:/home/web/rhodecode/production.ini') | |
669 |
|
670 | |||
670 | Note: when using mod_wsgi you'll also need to install, in the system's Python |

671 | Note: when using mod_wsgi you'll also need to install, in the system's Python | |
671 | environment, the same version of Mercurial that's inside RhodeCode's |

672 | environment, the same version of Mercurial that's inside RhodeCode's | |
672 | virtualenv. |

673 | virtualenv. | |
673 |
|
674 | |||
674 |
|
675 | |||
675 | Other configuration files |
|
676 | Other configuration files | |
676 | ------------------------- |
|
677 | ------------------------- | |
677 |
|
678 | |||
678 | Some example init.d scripts can be found here, for debian and gentoo: |
|
679 | Some example init.d scripts can be found here, for debian and gentoo: | |
679 |
|
680 | |||
680 | https://rhodecode.org/rhodecode/files/tip/init.d |
|
681 | https://rhodecode.org/rhodecode/files/tip/init.d | |
681 |
|
682 | |||
682 |
|
683 | |||
683 | Troubleshooting |
|
684 | Troubleshooting | |
684 | --------------- |
|
685 | --------------- | |
685 |
|
686 | |||
686 | :Q: **Missing static files?** |
|
687 | :Q: **Missing static files?** | |
687 | :A: Make sure either to set the `static_files = true` in the .ini file or |
|
688 | :A: Make sure either to set the `static_files = true` in the .ini file or | |
688 | double check the root path for your http setup. It should point to, |

689 | double check the root path for your http setup. It should point to, | |
689 | for example: |

690 | for example: | |
690 | /home/my-virtual-python/lib/python2.6/site-packages/rhodecode/public |
|
691 | /home/my-virtual-python/lib/python2.6/site-packages/rhodecode/public | |
691 |
|
692 | |||
692 | | |
|
693 | | | |
693 |
|
694 | |||
694 | :Q: **Can't install celery/rabbitmq** |
|
695 | :Q: **Can't install celery/rabbitmq** | |
695 | :A: Don't worry; RhodeCode works without them too. No extra setup is required. |

696 | :A: Don't worry; RhodeCode works without them too. No extra setup is required. | |
696 |
|
697 | |||
697 | | |
|
698 | | | |
698 |
|
699 | |||
699 | :Q: **Long lasting push timeouts?** |
|
700 | :Q: **Long lasting push timeouts?** | |
700 | :A: Make sure you set longer timeouts in your proxy/fcgi settings; timeouts |

701 | :A: Make sure you set longer timeouts in your proxy/fcgi settings; timeouts | |
701 | are caused by the http server and not RhodeCode. |

702 | are caused by the http server and not RhodeCode. | |
702 |
|
703 | |||
703 | | |
|
704 | | | |
704 |
|
705 | |||
705 | :Q: **Large pushes timeouts?** |
|
706 | :Q: **Large pushes timeouts?** | |
706 | :A: Make sure you set a proper max_body_size for the http server. |
|
707 | :A: Make sure you set a proper max_body_size for the http server. | |
707 |
|
708 | |||
708 | | |
|
709 | | | |
709 |
|
710 | |||
710 | :Q: **Apache doesn't pass basicAuth on pull/push?** |
|
711 | :Q: **Apache doesn't pass basicAuth on pull/push?** | |
711 | :A: Make sure you added `WSGIPassAuthorization On`. |

712 | :A: Make sure you added `WSGIPassAuthorization On`. | |
712 |
|
713 | |||
713 | For further questions search the `Issues tracker`_, or post a message in the |
|
714 | For further questions search the `Issues tracker`_, or post a message in the | |
714 | `google group rhodecode`_ |
|
715 | `google group rhodecode`_ | |
715 |
|
716 | |||
716 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
|
717 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv | |
717 | .. _python: http://www.python.org/ |
|
718 | .. _python: http://www.python.org/ | |
718 | .. _mercurial: http://mercurial.selenic.com/ |
|
719 | .. _mercurial: http://mercurial.selenic.com/ | |
719 | .. _celery: http://celeryproject.org/ |
|
720 | .. _celery: http://celeryproject.org/ | |
720 | .. _rabbitmq: http://www.rabbitmq.com/ |
|
721 | .. _rabbitmq: http://www.rabbitmq.com/ | |
721 | .. _python-ldap: http://www.python-ldap.org/ |
|
722 | .. _python-ldap: http://www.python-ldap.org/ | |
722 | .. _mercurial-server: http://www.lshift.net/mercurial-server.html |
|
723 | .. _mercurial-server: http://www.lshift.net/mercurial-server.html | |
723 | .. _PublishingRepositories: http://mercurial.selenic.com/wiki/PublishingRepositories |
|
724 | .. _PublishingRepositories: http://mercurial.selenic.com/wiki/PublishingRepositories | |
724 | .. _Issues tracker: https://bitbucket.org/marcinkuzminski/rhodecode/issues |
|
725 | .. _Issues tracker: https://bitbucket.org/marcinkuzminski/rhodecode/issues | |
725 | .. _google group rhodecode: http://groups.google.com/group/rhodecode |
|
726 | .. _google group rhodecode: http://groups.google.com/group/rhodecode |
@@ -1,17 +1,17 b'' | |||||
1 | Pylons==1.0.0 |
|
1 | Pylons==1.0.0 | |
2 | Beaker==1.6. |
2 | Beaker==1.6.3 | |
3 | WebHelpers>=1.2 |
|
3 | WebHelpers>=1.2 | |
4 | formencode==1.2.4 |
|
4 | formencode==1.2.4 | |
5 | SQLAlchemy==0.7.4 |
|
5 | SQLAlchemy==0.7.4 | |
6 | Mako==0.5.0 |
|
6 | Mako==0.5.0 | |
7 | pygments>=1.4 |
|
7 | pygments>=1.4 | |
8 | whoosh>=2.3.0,<2.4 |
|
8 | whoosh>=2.3.0,<2.4 | |
9 | celery>=2.2.5,<2.3 |
|
9 | celery>=2.2.5,<2.3 | |
10 | babel |
|
10 | babel | |
11 | python-dateutil>=1.5.0,<2.0.0 |
|
11 | python-dateutil>=1.5.0,<2.0.0 | |
12 | dulwich>=0.8.0,<0.9.0 |
|
12 | dulwich>=0.8.0,<0.9.0 | |
13 | webob==1.0.8 |
|
13 | webob==1.0.8 | |
14 | markdown==2.1.1 |
|
14 | markdown==2.1.1 | |
15 | docutils==0.8.1 |
|
15 | docutils==0.8.1 | |
16 | py-bcrypt |
|
16 | py-bcrypt | |
17 | mercurial>=2.1,<2.2 No newline at end of file |
|
17 | mercurial>=2.1,<2.2 |
@@ -1,92 +1,92 b'' | |||||
 # -*- coding: utf-8 -*-
 """
     rhodecode.__init__
     ~~~~~~~~~~~~~~~~~~
 
     RhodeCode, a web based repository management based on pylons
     versioning implementation: http://semver.org/
 
     :created_on: Apr 9, 2010
     :author: marcink
     :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com>
     :license: GPLv3, see COPYING for more details.
 """
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 import sys
 import platform
 
 VERSION = (1, 3, 2)
 __version__ = '.'.join((str(each) for each in VERSION[:4]))
 __dbversion__ = 5  # defines current db version for migrations
 __platform__ = platform.system()
 __license__ = 'GPLv3'
 __py_version__ = sys.version_info
 
 PLATFORM_WIN = ('Windows')
 PLATFORM_OTHERS = ('Linux', 'Darwin', 'FreeBSD', 'OpenBSD', 'SunOS')
 
 requirements = [
     "Pylons==1.0.0",
-    "Beaker==1.6.
+    "Beaker==1.6.3",
     "WebHelpers>=1.2",
     "formencode==1.2.4",
     "SQLAlchemy==0.7.4",
     "Mako==0.5.0",
     "pygments>=1.4",
     "whoosh>=2.3.0,<2.4",
     "celery>=2.2.5,<2.3",
     "babel",
     "python-dateutil>=1.5.0,<2.0.0",
     "dulwich>=0.8.0,<0.9.0",
     "webob==1.0.8",
     "markdown==2.1.1",
     "docutils==0.8.1",
 ]
 
 if __py_version__ < (2, 6):
     requirements.append("simplejson")
     requirements.append("pysqlite")
 
 if __platform__ in PLATFORM_WIN:
     requirements.append("mercurial>=2.1,<2.2")
 else:
     requirements.append("py-bcrypt")
     requirements.append("mercurial>=2.1,<2.2")
 
 
 try:
     from rhodecode.lib import get_current_revision
     _rev = get_current_revision(quiet=True)
 except ImportError:
     # this is needed when doing some setup.py operations
     _rev = False
 
 if len(VERSION) > 3 and _rev:
     __version__ += ' [rev:%s]' % _rev[0]
 
 
 def get_version():
     """Returns shorter version (digit parts only) as string."""
 
     return '.'.join((str(each) for each in VERSION[:3]))
 
 BACKENDS = {
     'hg': 'Mercurial repository',
     'git': 'Git repository',
 }
 
 CELERY_ON = False
 
 # link to config for pylons
 CONFIG = {}
@@ -1,414 +1,414 b'' | |||||
 # -*- coding: utf-8 -*-
 """
     rhodecode.controllers.admin.settings
     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
     settings controller for rhodecode admin
 
     :created_on: Jul 14, 2010
     :author: marcink
     :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com>
     :license: GPLv3, see COPYING for more details.
 """
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 
 import logging
 import traceback
 import formencode
 
 from sqlalchemy import func
 from formencode import htmlfill
 from pylons import request, session, tmpl_context as c, url, config
 from pylons.controllers.util import abort, redirect
 from pylons.i18n.translation import _
 
 from rhodecode.lib import helpers as h
 from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator, \
     HasPermissionAnyDecorator, NotAnonymous
 from rhodecode.lib.base import BaseController, render
 from rhodecode.lib.celerylib import tasks, run_task
 from rhodecode.lib.utils import repo2db_mapper, invalidate_cache, \
     set_rhodecode_config, repo_name_slug
 from rhodecode.model.db import RhodeCodeUi, Repository, RepoGroup, \
     RhodeCodeSetting
 from rhodecode.model.forms import UserForm, ApplicationSettingsForm, \
     ApplicationUiSettingsForm
 from rhodecode.model.scm import ScmModel
 from rhodecode.model.user import UserModel
 from rhodecode.model.db import User
 from rhodecode.model.notification import EmailNotificationModel
 from rhodecode.model.meta import Session
 
 log = logging.getLogger(__name__)
 
 
 class SettingsController(BaseController):
     """REST Controller styled on the Atom Publishing Protocol"""
     # To properly map this controller, ensure your config/routing.py
     # file has a resource setup:
     #     map.resource('setting', 'settings', controller='admin/settings',
     #         path_prefix='/admin', name_prefix='admin_')
 
     @LoginRequired()
     def __before__(self):
         c.admin_user = session.get('admin_user')
         c.admin_username = session.get('admin_username')
         super(SettingsController, self).__before__()
 
     @HasPermissionAllDecorator('hg.admin')
     def index(self, format='html'):
         """GET /admin/settings: All items in the collection"""
         # url('admin_settings')
 
         defaults = RhodeCodeSetting.get_app_settings()
         defaults.update(self.get_hg_ui_settings())
         return htmlfill.render(
             render('admin/settings/settings.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False
         )
 
     @HasPermissionAllDecorator('hg.admin')
     def create(self):
         """POST /admin/settings: Create a new item"""
         # url('admin_settings')
 
     @HasPermissionAllDecorator('hg.admin')
     def new(self, format='html'):
         """GET /admin/settings/new: Form to create a new item"""
         # url('admin_new_setting')
 
     @HasPermissionAllDecorator('hg.admin')
     def update(self, setting_id):
         """PUT /admin/settings/setting_id: Update an existing item"""
         # Forms posted to this method should contain a hidden field:
         #    <input type="hidden" name="_method" value="PUT" />
         # Or using helpers:
         #    h.form(url('admin_setting', setting_id=ID),
         #           method='put')
         # url('admin_setting', setting_id=ID)
         if setting_id == 'mapping':
             rm_obsolete = request.POST.get('destroy', False)
             log.debug('Rescanning directories with destroy=%s' % rm_obsolete)
             initial = ScmModel().repo_scan()
             log.debug('invalidating all repositories')
             for repo_name in initial.keys():
                 invalidate_cache('get_repo_cached_%s' % repo_name)
 
             added, removed = repo2db_mapper(initial, rm_obsolete)
 
             h.flash(_('Repositories successfully'
                       ' rescanned added: %s,removed: %s') % (added, removed),
                     category='success')
 
         if setting_id == 'whoosh':
             repo_location = self.get_hg_ui_settings()['paths_root_path']
             full_index = request.POST.get('full_index', False)
             run_task(tasks.whoosh_index, repo_location, full_index)
 
             h.flash(_('Whoosh reindex task scheduled'), category='success')
         if setting_id == 'global':
 
             application_form = ApplicationSettingsForm()()
             try:
                 form_result = application_form.to_python(dict(request.POST))
 
                 try:
                     hgsettings1 = RhodeCodeSetting.get_by_name('title')
                     hgsettings1.app_settings_value = \
                         form_result['rhodecode_title']
 
                     hgsettings2 = RhodeCodeSetting.get_by_name('realm')
                     hgsettings2.app_settings_value = \
                         form_result['rhodecode_realm']
 
                     hgsettings3 = RhodeCodeSetting.get_by_name('ga_code')
                     hgsettings3.app_settings_value = \
                         form_result['rhodecode_ga_code']
 
                     self.sa.add(hgsettings1)
                     self.sa.add(hgsettings2)
                     self.sa.add(hgsettings3)
                     self.sa.commit()
                     set_rhodecode_config(config)
                     h.flash(_('Updated application settings'),
                             category='success')
 
                 except Exception:
                     log.error(traceback.format_exc())
                     h.flash(_('error occurred during updating '
                               'application settings'),
                             category='error')
 
                     self.sa.rollback()
 
             except formencode.Invalid, errors:
                 return htmlfill.render(
                     render('admin/settings/settings.html'),
                     defaults=errors.value,
                     errors=errors.error_dict or {},
                     prefix_error=False,
                     encoding="UTF-8")
 
         if setting_id == 'mercurial':
             application_form = ApplicationUiSettingsForm()()
             try:
                 form_result = application_form.to_python(dict(request.POST))
 
                 try:
 
                     hgsettings1 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key == 'push_ssl').one()
                     hgsettings1.ui_value = form_result['web_push_ssl']
 
                     hgsettings2 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key == '/').one()
                     hgsettings2.ui_value = form_result['paths_root_path']
 
                     #HOOKS
                     hgsettings3 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key == 'changegroup.update').one()
                     hgsettings3.ui_active = \
                         bool(form_result['hooks_changegroup_update'])
 
                     hgsettings4 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key ==
                                 'changegroup.repo_size').one()
                     hgsettings4.ui_active = \
                         bool(form_result['hooks_changegroup_repo_size'])
 
                     hgsettings5 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key ==
                                 'pretxnchangegroup.push_logger').one()
                     hgsettings5.ui_active = \
                         bool(form_result['hooks_pretxnchangegroup'
                                          '_push_logger'])
 
                     hgsettings6 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key ==
                                 'preoutgoing.pull_logger').one()
                     hgsettings6.ui_active = \
                         bool(form_result['hooks_preoutgoing_pull_logger'])
 
                     self.sa.add(hgsettings1)
                     self.sa.add(hgsettings2)
                     self.sa.add(hgsettings3)
                     self.sa.add(hgsettings4)
                     self.sa.add(hgsettings5)
                     self.sa.add(hgsettings6)
                     self.sa.commit()
 
                     h.flash(_('Updated mercurial settings'),
                             category='success')
 
                 except:
                     log.error(traceback.format_exc())
                     h.flash(_('error occurred during updating '
                               'application settings'), category='error')
 
                     self.sa.rollback()
 
             except formencode.Invalid, errors:
                 return htmlfill.render(
                     render('admin/settings/settings.html'),
                     defaults=errors.value,
                     errors=errors.error_dict or {},
                     prefix_error=False,
                     encoding="UTF-8")
 
         if setting_id == 'hooks':
             ui_key = request.POST.get('new_hook_ui_key')
             ui_value = request.POST.get('new_hook_ui_value')
             try:
 
                 if ui_value and ui_key:
                     RhodeCodeUi.create_or_update_hook(ui_key, ui_value)
                     h.flash(_('Added new hook'),
                             category='success')
 
                 # check for edits
                 update = False
                 _d = request.POST.dict_of_lists()
                 for k, v in zip(_d.get('hook_ui_key', []),
                                 _d.get('hook_ui_value_new', [])):
                     RhodeCodeUi.create_or_update_hook(k, v)
                     update = True
 
                 if update:
                     h.flash(_('Updated hooks'), category='success')
-
+                self.sa.commit()
             except:
                 log.error(traceback.format_exc())
                 h.flash(_('error occurred during hook creation'),
                         category='error')
 
             return redirect(url('admin_edit_setting', setting_id='hooks'))
 
         if setting_id == 'email':
             test_email = request.POST.get('test_email')
             test_email_subj = 'RhodeCode TestEmail'
             test_email_body = 'RhodeCode Email test'
 
             test_email_html_body = EmailNotificationModel()\
                 .get_email_tmpl(EmailNotificationModel.TYPE_DEFAULT,
                                 body=test_email_body)
 
             recipients = [test_email] if [test_email] else None
 
             run_task(tasks.send_email, recipients, test_email_subj,
                      test_email_body, test_email_html_body)
 
             h.flash(_('Email task created'), category='success')
             return redirect(url('admin_settings'))
 
     @HasPermissionAllDecorator('hg.admin')
     def delete(self, setting_id):
         """DELETE /admin/settings/setting_id: Delete an existing item"""
         # Forms posted to this method should contain a hidden field:
         #    <input type="hidden" name="_method" value="DELETE" />
         # Or using helpers:
         #    h.form(url('admin_setting', setting_id=ID),
         #           method='delete')
         # url('admin_setting', setting_id=ID)
         if setting_id == 'hooks':
             hook_id = request.POST.get('hook_id')
             RhodeCodeUi.delete(hook_id)
-
+            self.sa.commit()
 
     @HasPermissionAllDecorator('hg.admin')
     def show(self, setting_id, format='html'):
         """
         GET /admin/settings/setting_id: Show a specific item"""
         # url('admin_setting', setting_id=ID)
 
     @HasPermissionAllDecorator('hg.admin')
     def edit(self, setting_id, format='html'):
         """
         GET /admin/settings/setting_id/edit: Form to
         edit an existing item"""
         # url('admin_edit_setting', setting_id=ID)
 
         c.hooks = RhodeCodeUi.get_builtin_hooks()
         c.custom_hooks = RhodeCodeUi.get_custom_hooks()
 
         return htmlfill.render(
             render('admin/settings/hooks.html'),
             defaults={},
             encoding="UTF-8",
             force_defaults=False
         )
 
     @NotAnonymous()
     def my_account(self):
         """
         GET /_admin/my_account Displays info about my account
         """
         # url('admin_settings_my_account')
 
         c.user = User.get(self.rhodecode_user.user_id)
         all_repos = self.sa.query(Repository)\
             .filter(Repository.user_id == c.user.user_id)\
             .order_by(func.lower(Repository.repo_name)).all()
 
         c.user_repos = ScmModel().get_repos(all_repos)
 
         if c.user.username == 'default':
             h.flash(_("You can't edit this user since it's"
                       " crucial for entire application"), category='warning')
             return redirect(url('users'))
 
         defaults = c.user.get_dict()
         return htmlfill.render(
             render('admin/users/user_edit_my_account.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False
         )
 
     def my_account_update(self):
         """PUT /_admin/my_account_update: Update an existing item"""
         # Forms posted to this method should contain a hidden field:
         #    <input type="hidden" name="_method" value="PUT" />
         # Or using helpers:
         #    h.form(url('admin_settings_my_account_update'),
         #           method='put')
         # url('admin_settings_my_account_update', id=ID)
         user_model = UserModel()
         uid = self.rhodecode_user.user_id
         _form = UserForm(edit=True,
                          old_data={'user_id': uid,
                                    'email': self.rhodecode_user.email})()
         form_result = {}
         try:
             form_result = _form.to_python(dict(request.POST))
             user_model.update_my_account(uid, form_result)
             h.flash(_('Your account was updated successfully'),
                     category='success')
             Session.commit()
         except formencode.Invalid, errors:
             c.user = User.get(self.rhodecode_user.user_id)
             all_repos = self.sa.query(Repository)\
|
362 | all_repos = self.sa.query(Repository)\ | |
363 | .filter(Repository.user_id == c.user.user_id)\ |
|
363 | .filter(Repository.user_id == c.user.user_id)\ | |
364 | .order_by(func.lower(Repository.repo_name))\ |
|
364 | .order_by(func.lower(Repository.repo_name))\ | |
365 | .all() |
|
365 | .all() | |
366 | c.user_repos = ScmModel().get_repos(all_repos) |
|
366 | c.user_repos = ScmModel().get_repos(all_repos) | |
367 |
|
367 | |||
368 | return htmlfill.render( |
|
368 | return htmlfill.render( | |
369 | render('admin/users/user_edit_my_account.html'), |
|
369 | render('admin/users/user_edit_my_account.html'), | |
370 | defaults=errors.value, |
|
370 | defaults=errors.value, | |
371 | errors=errors.error_dict or {}, |
|
371 | errors=errors.error_dict or {}, | |
372 | prefix_error=False, |
|
372 | prefix_error=False, | |
373 | encoding="UTF-8") |
|
373 | encoding="UTF-8") | |
374 | except Exception: |
|
374 | except Exception: | |
375 | log.error(traceback.format_exc()) |
|
375 | log.error(traceback.format_exc()) | |
376 | h.flash(_('error occurred during update of user %s') \ |
|
376 | h.flash(_('error occurred during update of user %s') \ | |
377 | % form_result.get('username'), category='error') |
|
377 | % form_result.get('username'), category='error') | |
378 |
|
378 | |||
379 | return redirect(url('my_account')) |
|
379 | return redirect(url('my_account')) | |
380 |
|
380 | |||
381 | @NotAnonymous() |
|
381 | @NotAnonymous() | |
382 | @HasPermissionAnyDecorator('hg.admin', 'hg.create.repository') |
|
382 | @HasPermissionAnyDecorator('hg.admin', 'hg.create.repository') | |
383 | def create_repository(self): |
|
383 | def create_repository(self): | |
384 | """GET /_admin/create_repository: Form to create a new item""" |
|
384 | """GET /_admin/create_repository: Form to create a new item""" | |
385 |
|
385 | |||
386 | c.repo_groups = RepoGroup.groups_choices() |
|
386 | c.repo_groups = RepoGroup.groups_choices() | |
387 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) |
|
387 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) | |
388 |
|
388 | |||
389 | new_repo = request.GET.get('repo', '') |
|
389 | new_repo = request.GET.get('repo', '') | |
390 | c.new_repo = repo_name_slug(new_repo) |
|
390 | c.new_repo = repo_name_slug(new_repo) | |
391 |
|
391 | |||
392 | return render('admin/repos/repo_add_create_repository.html') |
|
392 | return render('admin/repos/repo_add_create_repository.html') | |
393 |
|
393 | |||
394 | def get_hg_ui_settings(self): |
|
394 | def get_hg_ui_settings(self): | |
395 | ret = self.sa.query(RhodeCodeUi).all() |
|
395 | ret = self.sa.query(RhodeCodeUi).all() | |
396 |
|
396 | |||
397 | if not ret: |
|
397 | if not ret: | |
398 | raise Exception('Could not get application ui settings !') |
|
398 | raise Exception('Could not get application ui settings !') | |
399 | settings = {} |
|
399 | settings = {} | |
400 | for each in ret: |
|
400 | for each in ret: | |
401 | k = each.ui_key |
|
401 | k = each.ui_key | |
402 | v = each.ui_value |
|
402 | v = each.ui_value | |
403 | if k == '/': |
|
403 | if k == '/': | |
404 | k = 'root_path' |
|
404 | k = 'root_path' | |
405 |
|
405 | |||
406 | if k.find('.') != -1: |
|
406 | if k.find('.') != -1: | |
407 | k = k.replace('.', '_') |
|
407 | k = k.replace('.', '_') | |
408 |
|
408 | |||
409 | if each.ui_section == 'hooks': |
|
409 | if each.ui_section == 'hooks': | |
410 | v = each.ui_active |
|
410 | v = each.ui_active | |
411 |
|
411 | |||
412 | settings[each.ui_section + '_' + k] = v |
|
412 | settings[each.ui_section + '_' + k] = v | |
413 |
|
413 | |||
414 | return settings |
|
414 | return settings |
@@ -1,367 +1,373 @@
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | """ |
|
2 | """ | |
3 | rhodecode.controllers.changeset |
|
3 | rhodecode.controllers.changeset | |
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
5 |
|
5 | |||
6 | changeset controller for pylons showoing changes beetween |
|
6 | changeset controller for pylons showoing changes beetween | |
7 | revisions |
|
7 | revisions | |
8 |
|
8 | |||
9 | :created_on: Apr 25, 2010 |
|
9 | :created_on: Apr 25, 2010 | |
10 | :author: marcink |
|
10 | :author: marcink | |
11 | :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com> |
|
11 | :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com> | |
12 | :license: GPLv3, see COPYING for more details. |
|
12 | :license: GPLv3, see COPYING for more details. | |
13 | """ |
|
13 | """ | |
14 | # This program is free software: you can redistribute it and/or modify |
|
14 | # This program is free software: you can redistribute it and/or modify | |
15 | # it under the terms of the GNU General Public License as published by |
|
15 | # it under the terms of the GNU General Public License as published by | |
16 | # the Free Software Foundation, either version 3 of the License, or |
|
16 | # the Free Software Foundation, either version 3 of the License, or | |
17 | # (at your option) any later version. |
|
17 | # (at your option) any later version. | |
18 | # |
|
18 | # | |
19 | # This program is distributed in the hope that it will be useful, |
|
19 | # This program is distributed in the hope that it will be useful, | |
20 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
20 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
21 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
21 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
22 | # GNU General Public License for more details. |
|
22 | # GNU General Public License for more details. | |
23 | # |
|
23 | # | |
24 | # You should have received a copy of the GNU General Public License |
|
24 | # You should have received a copy of the GNU General Public License | |
25 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
25 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
26 | import logging |
|
26 | import logging | |
27 | import traceback |
|
27 | import traceback | |
28 | from collections import defaultdict |
|
28 | from collections import defaultdict | |
29 | from webob.exc import HTTPForbidden |
|
29 | from webob.exc import HTTPForbidden | |
30 |
|
30 | |||
31 | from pylons import tmpl_context as c, url, request, response |
|
31 | from pylons import tmpl_context as c, url, request, response | |
32 | from pylons.i18n.translation import _ |
|
32 | from pylons.i18n.translation import _ | |
33 | from pylons.controllers.util import redirect |
|
33 | from pylons.controllers.util import redirect | |
34 | from pylons.decorators import jsonify |
|
34 | from pylons.decorators import jsonify | |
35 |
|
35 | |||
36 | from rhodecode.lib.vcs.exceptions import RepositoryError, ChangesetError, \ |
|
36 | from rhodecode.lib.vcs.exceptions import RepositoryError, ChangesetError, \ | |
37 | ChangesetDoesNotExistError |
|
37 | ChangesetDoesNotExistError | |
38 | from rhodecode.lib.vcs.nodes import FileNode |
|
38 | from rhodecode.lib.vcs.nodes import FileNode | |
39 |
|
39 | |||
40 | import rhodecode.lib.helpers as h |
|
40 | import rhodecode.lib.helpers as h | |
41 | from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator |
|
41 | from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator | |
42 | from rhodecode.lib.base import BaseRepoController, render |
|
42 | from rhodecode.lib.base import BaseRepoController, render | |
43 | from rhodecode.lib.utils import EmptyChangeset |
|
43 | from rhodecode.lib.utils import EmptyChangeset | |
44 | from rhodecode.lib.compat import OrderedDict |
|
44 | from rhodecode.lib.compat import OrderedDict | |
45 | from rhodecode.lib import diffs |
|
45 | from rhodecode.lib import diffs | |
46 | from rhodecode.model.db import ChangesetComment |
|
46 | from rhodecode.model.db import ChangesetComment | |
47 | from rhodecode.model.comment import ChangesetCommentsModel |
|
47 | from rhodecode.model.comment import ChangesetCommentsModel | |
48 | from rhodecode.model.meta import Session |
|
48 | from rhodecode.model.meta import Session | |
49 | from rhodecode.lib.diffs import wrapped_diff |
|
49 | from rhodecode.lib.diffs import wrapped_diff | |
50 |
|
50 | |||
51 | log = logging.getLogger(__name__) |
|
51 | log = logging.getLogger(__name__) | |
52 |
|
52 | |||
53 |
|
53 | |||
54 | def anchor_url(revision, path): |
|
54 | def anchor_url(revision, path): | |
55 | fid = h.FID(revision, path) |
|
55 | fid = h.FID(revision, path) | |
56 | return h.url.current(anchor=fid, **dict(request.GET)) |
|
56 | return h.url.current(anchor=fid, **dict(request.GET)) | |
57 |
|
57 | |||
58 |
|
58 | |||
59 | def get_ignore_ws(fid, GET): |
|
59 | def get_ignore_ws(fid, GET): | |
60 | ig_ws_global = request.GET.get('ignorews') |
|
60 | ig_ws_global = request.GET.get('ignorews') | |
61 | ig_ws = filter(lambda k: k.startswith('WS'), GET.getall(fid)) |
|
61 | ig_ws = filter(lambda k: k.startswith('WS'), GET.getall(fid)) | |
62 | if ig_ws: |
|
62 | if ig_ws: | |
63 | try: |
|
63 | try: | |
64 | return int(ig_ws[0].split(':')[-1]) |
|
64 | return int(ig_ws[0].split(':')[-1]) | |
65 | except: |
|
65 | except: | |
66 | pass |
|
66 | pass | |
67 | return ig_ws_global |
|
67 | return ig_ws_global | |
68 |
|
68 | |||
69 |
|
69 | |||
70 | def _ignorews_url(fileid=None): |
|
70 | def _ignorews_url(fileid=None): | |
71 |
|
71 | |||
72 | params = defaultdict(list) |
|
72 | params = defaultdict(list) | |
73 | lbl = _('show white space') |
|
73 | lbl = _('show white space') | |
74 | ig_ws = get_ignore_ws(fileid, request.GET) |
|
74 | ig_ws = get_ignore_ws(fileid, request.GET) | |
75 | ln_ctx = get_line_ctx(fileid, request.GET) |
|
75 | ln_ctx = get_line_ctx(fileid, request.GET) | |
76 | # global option |
|
76 | # global option | |
77 | if fileid is None: |
|
77 | if fileid is None: | |
78 | if ig_ws is None: |
|
78 | if ig_ws is None: | |
79 | params['ignorews'] += [1] |
|
79 | params['ignorews'] += [1] | |
80 | lbl = _('ignore white space') |
|
80 | lbl = _('ignore white space') | |
81 | ctx_key = 'context' |
|
81 | ctx_key = 'context' | |
82 | ctx_val = ln_ctx |
|
82 | ctx_val = ln_ctx | |
83 | # per file options |
|
83 | # per file options | |
84 | else: |
|
84 | else: | |
85 | if ig_ws is None: |
|
85 | if ig_ws is None: | |
86 | params[fileid] += ['WS:1'] |
|
86 | params[fileid] += ['WS:1'] | |
87 | lbl = _('ignore white space') |
|
87 | lbl = _('ignore white space') | |
88 |
|
88 | |||
89 | ctx_key = fileid |
|
89 | ctx_key = fileid | |
90 | ctx_val = 'C:%s' % ln_ctx |
|
90 | ctx_val = 'C:%s' % ln_ctx | |
91 | # if we have passed in ln_ctx pass it along to our params |
|
91 | # if we have passed in ln_ctx pass it along to our params | |
92 | if ln_ctx: |
|
92 | if ln_ctx: | |
93 | params[ctx_key] += [ctx_val] |
|
93 | params[ctx_key] += [ctx_val] | |
94 |
|
94 | |||
95 | params['anchor'] = fileid |
|
95 | params['anchor'] = fileid | |
96 | img = h.image(h.url('/images/icons/text_strikethrough.png'), lbl, class_='icon') |
|
96 | img = h.image(h.url('/images/icons/text_strikethrough.png'), lbl, class_='icon') | |
97 | return h.link_to(img, h.url.current(**params), title=lbl, class_='tooltip') |
|
97 | return h.link_to(img, h.url.current(**params), title=lbl, class_='tooltip') | |
98 |
|
98 | |||
99 |
|
99 | |||
100 | def get_line_ctx(fid, GET): |
|
100 | def get_line_ctx(fid, GET): | |
101 | ln_ctx_global = request.GET.get('context') |
|
101 | ln_ctx_global = request.GET.get('context') | |
102 | ln_ctx = filter(lambda k: k.startswith('C'), GET.getall(fid)) |
|
102 | ln_ctx = filter(lambda k: k.startswith('C'), GET.getall(fid)) | |
103 |
|
103 | |||
104 | if ln_ctx: |
|
104 | if ln_ctx: | |
105 | retval = ln_ctx[0].split(':')[-1] |
|
105 | retval = ln_ctx[0].split(':')[-1] | |
106 | else: |
|
106 | else: | |
107 | retval = ln_ctx_global |
|
107 | retval = ln_ctx_global | |
108 |
|
108 | |||
109 | try: |
|
109 | try: | |
110 | return int(retval) |
|
110 | return int(retval) | |
111 | except: |
|
111 | except: | |
112 | return |
|
112 | return | |
113 |
|
113 | |||
114 |
|
114 | |||
115 | def _context_url(fileid=None): |
|
115 | def _context_url(fileid=None): | |
116 | """ |
|
116 | """ | |
117 | Generates url for context lines |
|
117 | Generates url for context lines | |
118 |
|
118 | |||
119 | :param fileid: |
|
119 | :param fileid: | |
120 | """ |
|
120 | """ | |
121 | ig_ws = get_ignore_ws(fileid, request.GET) |
|
121 | ig_ws = get_ignore_ws(fileid, request.GET) | |
122 | ln_ctx = (get_line_ctx(fileid, request.GET) or 3) * 2 |
|
122 | ln_ctx = (get_line_ctx(fileid, request.GET) or 3) * 2 | |
123 |
|
123 | |||
124 | params = defaultdict(list) |
|
124 | params = defaultdict(list) | |
125 |
|
125 | |||
126 | # global option |
|
126 | # global option | |
127 | if fileid is None: |
|
127 | if fileid is None: | |
128 | if ln_ctx > 0: |
|
128 | if ln_ctx > 0: | |
129 | params['context'] += [ln_ctx] |
|
129 | params['context'] += [ln_ctx] | |
130 |
|
130 | |||
131 | if ig_ws: |
|
131 | if ig_ws: | |
132 | ig_ws_key = 'ignorews' |
|
132 | ig_ws_key = 'ignorews' | |
133 | ig_ws_val = 1 |
|
133 | ig_ws_val = 1 | |
134 |
|
134 | |||
135 | # per file option |
|
135 | # per file option | |
136 | else: |
|
136 | else: | |
137 | params[fileid] += ['C:%s' % ln_ctx] |
|
137 | params[fileid] += ['C:%s' % ln_ctx] | |
138 | ig_ws_key = fileid |
|
138 | ig_ws_key = fileid | |
139 | ig_ws_val = 'WS:%s' % 1 |
|
139 | ig_ws_val = 'WS:%s' % 1 | |
140 |
|
140 | |||
141 | if ig_ws: |
|
141 | if ig_ws: | |
142 | params[ig_ws_key] += [ig_ws_val] |
|
142 | params[ig_ws_key] += [ig_ws_val] | |
143 |
|
143 | |||
144 | lbl = _('%s line context') % ln_ctx |
|
144 | lbl = _('%s line context') % ln_ctx | |
145 |
|
145 | |||
146 | params['anchor'] = fileid |
|
146 | params['anchor'] = fileid | |
147 | img = h.image(h.url('/images/icons/table_add.png'), lbl, class_='icon') |
|
147 | img = h.image(h.url('/images/icons/table_add.png'), lbl, class_='icon') | |
148 | return h.link_to(img, h.url.current(**params), title=lbl, class_='tooltip') |
|
148 | return h.link_to(img, h.url.current(**params), title=lbl, class_='tooltip') | |
149 |
|
149 | |||
150 |
|
150 | |||
151 | class ChangesetController(BaseRepoController): |
|
151 | class ChangesetController(BaseRepoController): | |
152 |
|
152 | |||
153 | @LoginRequired() |
|
153 | @LoginRequired() | |
154 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
154 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
155 | 'repository.admin') |
|
155 | 'repository.admin') | |
156 | def __before__(self): |
|
156 | def __before__(self): | |
157 | super(ChangesetController, self).__before__() |
|
157 | super(ChangesetController, self).__before__() | |
158 | c.affected_files_cut_off = 60 |
|
158 | c.affected_files_cut_off = 60 | |
159 |
|
159 | |||
160 | def index(self, revision): |
|
160 | def index(self, revision): | |
161 |
|
161 | |||
162 | c.anchor_url = anchor_url |
|
162 | c.anchor_url = anchor_url | |
163 | c.ignorews_url = _ignorews_url |
|
163 | c.ignorews_url = _ignorews_url | |
164 | c.context_url = _context_url |
|
164 | c.context_url = _context_url | |
165 |
|
165 | |||
166 | #get ranges of revisions if preset |
|
166 | #get ranges of revisions if preset | |
167 | rev_range = revision.split('...')[:2] |
|
167 | rev_range = revision.split('...')[:2] | |
168 | enable_comments = True |
|
168 | enable_comments = True | |
169 | try: |
|
169 | try: | |
170 | if len(rev_range) == 2: |
|
170 | if len(rev_range) == 2: | |
171 | enable_comments = False |
|
171 | enable_comments = False | |
172 | rev_start = rev_range[0] |
|
172 | rev_start = rev_range[0] | |
173 | rev_end = rev_range[1] |
|
173 | rev_end = rev_range[1] | |
174 | rev_ranges = c.rhodecode_repo.get_changesets(start=rev_start, |
|
174 | rev_ranges = c.rhodecode_repo.get_changesets(start=rev_start, | |
175 | end=rev_end) |
|
175 | end=rev_end) | |
176 | else: |
|
176 | else: | |
177 | rev_ranges = [c.rhodecode_repo.get_changeset(revision)] |
|
177 | rev_ranges = [c.rhodecode_repo.get_changeset(revision)] | |
178 |
|
178 | |||
179 | c.cs_ranges = list(rev_ranges) |
|
179 | c.cs_ranges = list(rev_ranges) | |
180 | if not c.cs_ranges: |
|
180 | if not c.cs_ranges: | |
181 | raise RepositoryError('Changeset range returned empty result') |
|
181 | raise RepositoryError('Changeset range returned empty result') | |
182 |
|
182 | |||
183 | except (RepositoryError, ChangesetDoesNotExistError, Exception), e: |
|
183 | except (RepositoryError, ChangesetDoesNotExistError, Exception), e: | |
184 | log.error(traceback.format_exc()) |
|
184 | log.error(traceback.format_exc()) | |
185 | h.flash(str(e), category='warning') |
|
185 | h.flash(str(e), category='warning') | |
186 | return redirect(url('home')) |
|
186 | return redirect(url('home')) | |
187 |
|
187 | |||
188 | c.changes = OrderedDict() |
|
188 | c.changes = OrderedDict() | |
189 |
|
189 | |||
190 | c.lines_added = 0 # count of lines added |
|
190 | c.lines_added = 0 # count of lines added | |
191 | c.lines_deleted = 0 # count of lines removes |
|
191 | c.lines_deleted = 0 # count of lines removes | |
192 |
|
192 | |||
193 | cumulative_diff = 0 |
|
193 | cumulative_diff = 0 | |
194 | c.cut_off = False # defines if cut off limit is reached |
|
194 | c.cut_off = False # defines if cut off limit is reached | |
195 |
|
195 | |||
196 | c.comments = [] |
|
196 | c.comments = [] | |
197 | c.inline_comments = [] |
|
197 | c.inline_comments = [] | |
198 | c.inline_cnt = 0 |
|
198 | c.inline_cnt = 0 | |
199 | # Iterate over ranges (default changeset view is always one changeset) |
|
199 | # Iterate over ranges (default changeset view is always one changeset) | |
200 | for changeset in c.cs_ranges: |
|
200 | for changeset in c.cs_ranges: | |
201 | c.comments.extend(ChangesetCommentsModel()\ |
|
201 | c.comments.extend(ChangesetCommentsModel()\ | |
202 | .get_comments(c.rhodecode_db_repo.repo_id, |
|
202 | .get_comments(c.rhodecode_db_repo.repo_id, | |
203 | changeset.raw_id)) |
|
203 | changeset.raw_id)) | |
204 | inlines = ChangesetCommentsModel()\ |
|
204 | inlines = ChangesetCommentsModel()\ | |
205 | .get_inline_comments(c.rhodecode_db_repo.repo_id, |
|
205 | .get_inline_comments(c.rhodecode_db_repo.repo_id, | |
206 | changeset.raw_id) |
|
206 | changeset.raw_id) | |
207 | c.inline_comments.extend(inlines) |
|
207 | c.inline_comments.extend(inlines) | |
208 | c.changes[changeset.raw_id] = [] |
|
208 | c.changes[changeset.raw_id] = [] | |
209 | try: |
|
209 | try: | |
210 | changeset_parent = changeset.parents[0] |
|
210 | changeset_parent = changeset.parents[0] | |
211 | except IndexError: |
|
211 | except IndexError: | |
212 | changeset_parent = None |
|
212 | changeset_parent = None | |
213 |
|
213 | |||
214 | #================================================================== |
|
214 | #================================================================== | |
215 | # ADDED FILES |
|
215 | # ADDED FILES | |
216 | #================================================================== |
|
216 | #================================================================== | |
217 | for node in changeset.added: |
|
217 | for node in changeset.added: | |
218 | fid = h.FID(revision, node.path) |
|
218 | fid = h.FID(revision, node.path) | |
219 | line_context_lcl = get_line_ctx(fid, request.GET) |
|
219 | line_context_lcl = get_line_ctx(fid, request.GET) | |
220 | ign_whitespace_lcl = get_ignore_ws(fid, request.GET) |
|
220 | ign_whitespace_lcl = get_ignore_ws(fid, request.GET) | |
221 | lim = self.cut_off_limit |
|
221 | lim = self.cut_off_limit | |
222 | if cumulative_diff > self.cut_off_limit: |
|
222 | if cumulative_diff > self.cut_off_limit: | |
223 | lim = -1 |
|
223 | lim = -1 | |
224 |
size, cs1, cs2, diff, st = wrapped_diff( |
|
224 | size, cs1, cs2, diff, st = wrapped_diff( | |
|
225 | filenode_old=None, | |||
225 |
|
|
226 | filenode_new=node, | |
226 |
|
|
227 | cut_off_limit=lim, | |
227 |
|
|
228 | ignore_whitespace=ign_whitespace_lcl, | |
228 |
|
|
229 | line_context=line_context_lcl, | |
229 |
|
|
230 | enable_comments=enable_comments | |
|
231 | ) | |||
230 | cumulative_diff += size |
|
232 | cumulative_diff += size | |
231 | c.lines_added += st[0] |
|
233 | c.lines_added += st[0] | |
232 | c.lines_deleted += st[1] |
|
234 | c.lines_deleted += st[1] | |
233 |
c.changes[changeset.raw_id].append( |
|
235 | c.changes[changeset.raw_id].append( | |
234 | cs1, cs2, st)) |
|
236 | ('added', node, diff, cs1, cs2, st) | |
|
237 | ) | |||
235 |
|
238 | |||
236 | #================================================================== |
|
239 | #================================================================== | |
237 | # CHANGED FILES |
|
240 | # CHANGED FILES | |
238 | #================================================================== |
|
241 | #================================================================== | |
239 | for node in changeset.changed: |
|
242 | for node in changeset.changed: | |
240 | try: |
|
243 | try: | |
241 | filenode_old = changeset_parent.get_node(node.path) |
|
244 | filenode_old = changeset_parent.get_node(node.path) | |
242 | except ChangesetError: |
|
245 | except ChangesetError: | |
243 | log.warning('Unable to fetch parent node for diff') |
|
246 | log.warning('Unable to fetch parent node for diff') | |
244 | filenode_old = FileNode(node.path, '', EmptyChangeset()) |
|
247 | filenode_old = FileNode(node.path, '', EmptyChangeset()) | |
245 |
|
248 | |||
246 | fid = h.FID(revision, node.path) |
|
249 | fid = h.FID(revision, node.path) | |
247 | line_context_lcl = get_line_ctx(fid, request.GET) |
|
250 | line_context_lcl = get_line_ctx(fid, request.GET) | |
248 | ign_whitespace_lcl = get_ignore_ws(fid, request.GET) |
|
251 | ign_whitespace_lcl = get_ignore_ws(fid, request.GET) | |
249 | lim = self.cut_off_limit |
|
252 | lim = self.cut_off_limit | |
250 | if cumulative_diff > self.cut_off_limit: |
|
253 | if cumulative_diff > self.cut_off_limit: | |
251 | lim = -1 |
|
254 | lim = -1 | |
252 |
size, cs1, cs2, diff, st = wrapped_diff( |
|
255 | size, cs1, cs2, diff, st = wrapped_diff( | |
|
256 | filenode_old=filenode_old, | |||
253 |
|
|
257 | filenode_new=node, | |
254 |
|
|
258 | cut_off_limit=lim, | |
255 |
|
|
259 | ignore_whitespace=ign_whitespace_lcl, | |
256 |
|
|
260 | line_context=line_context_lcl, | |
257 |
|
|
261 | enable_comments=enable_comments | |
|
262 | ) | |||
258 | cumulative_diff += size |
|
263 | cumulative_diff += size | |
259 | c.lines_added += st[0] |
|
264 | c.lines_added += st[0] | |
260 | c.lines_deleted += st[1] |
|
265 | c.lines_deleted += st[1] | |
261 |
c.changes[changeset.raw_id].append( |
|
266 | c.changes[changeset.raw_id].append( | |
262 | cs1, cs2, st)) |
|
267 | ('changed', node, diff, cs1, cs2, st) | |
263 |
|
268 | ) | ||
264 | #================================================================== |
|
269 | #================================================================== | |
265 | # REMOVED FILES |
|
270 | # REMOVED FILES | |
266 | #================================================================== |
|
271 | #================================================================== | |
267 | for node in changeset.removed: |
|
272 | for node in changeset.removed: | |
268 |
c.changes[changeset.raw_id].append( |
|
273 | c.changes[changeset.raw_id].append( | |
269 |
|
|
274 | ('removed', node, None, None, None, (0, 0)) | |
|
275 | ) | |||
270 |
|
276 | |||
271 | # count inline comments |
|
277 | # count inline comments | |
272 | for path, lines in c.inline_comments: |
|
278 | for path, lines in c.inline_comments: | |
273 | for comments in lines.values(): |
|
279 | for comments in lines.values(): | |
274 | c.inline_cnt += len(comments) |
|
280 | c.inline_cnt += len(comments) | |
275 |
|
281 | |||
276 | if len(c.cs_ranges) == 1: |
|
282 | if len(c.cs_ranges) == 1: | |
277 | c.changeset = c.cs_ranges[0] |
|
283 | c.changeset = c.cs_ranges[0] | |
278 | c.changes = c.changes[c.changeset.raw_id] |
|
284 | c.changes = c.changes[c.changeset.raw_id] | |
279 |
|
285 | |||
280 | return render('changeset/changeset.html') |
|
286 | return render('changeset/changeset.html') | |
281 | else: |
|
287 | else: | |
282 | return render('changeset/changeset_range.html') |
|
288 | return render('changeset/changeset_range.html') | |
283 |
|
289 | |||
284 | def raw_changeset(self, revision): |
|
290 | def raw_changeset(self, revision): | |
285 |
|
291 | |||
286 | method = request.GET.get('diff', 'show') |
|
292 | method = request.GET.get('diff', 'show') | |
287 | ignore_whitespace = request.GET.get('ignorews') == '1' |
|
293 | ignore_whitespace = request.GET.get('ignorews') == '1' | |
288 | line_context = request.GET.get('context', 3) |
|
294 | line_context = request.GET.get('context', 3) | |
289 | try: |
|
295 | try: | |
290 | c.scm_type = c.rhodecode_repo.alias |
|
296 | c.scm_type = c.rhodecode_repo.alias | |
291 | c.changeset = c.rhodecode_repo.get_changeset(revision) |
|
297 | c.changeset = c.rhodecode_repo.get_changeset(revision) | |
292 | except RepositoryError: |
|
298 | except RepositoryError: | |
293 | log.error(traceback.format_exc()) |
|
299 | log.error(traceback.format_exc()) | |
294 | return redirect(url('home')) |
|
300 | return redirect(url('home')) | |
295 | else: |
|
301 | else: | |
296 | try: |
|
302 | try: | |
297 | c.changeset_parent = c.changeset.parents[0] |
|
303 | c.changeset_parent = c.changeset.parents[0] | |
298 | except IndexError: |
|
304 | except IndexError: | |
299 | c.changeset_parent = None |
|
305 | c.changeset_parent = None | |
300 | c.changes = [] |
|
306 | c.changes = [] | |
301 |
|
307 | |||
302 | for node in c.changeset.added: |
|
308 | for node in c.changeset.added: | |
303 | filenode_old = FileNode(node.path, '') |
|
309 | filenode_old = FileNode(node.path, '') | |
304 | if filenode_old.is_binary or node.is_binary: |
|
310 | if filenode_old.is_binary or node.is_binary: | |
305 | diff = _('binary file') + '\n' |
|
311 | diff = _('binary file') + '\n' | |
306 | else: |
|
312 | else: | |
307 | f_gitdiff = diffs.get_gitdiff(filenode_old, node, |
|
313 | f_gitdiff = diffs.get_gitdiff(filenode_old, node, | |
308 | ignore_whitespace=ignore_whitespace, |
|
314 | ignore_whitespace=ignore_whitespace, | |
309 | context=line_context) |
|
315 | context=line_context) | |
310 | diff = diffs.DiffProcessor(f_gitdiff, |
|
316 | diff = diffs.DiffProcessor(f_gitdiff, | |
311 | format='gitdiff').raw_diff() |
|
317 | format='gitdiff').raw_diff() | |
312 |
|
318 | |||
313 | cs1 = None |
|
319 | cs1 = None | |
314 |
cs2 = node. |
|
320 | cs2 = node.changeset.raw_id | |
315 | c.changes.append(('added', node, diff, cs1, cs2)) |
|
321 | c.changes.append(('added', node, diff, cs1, cs2)) | |
316 |
|
322 | |||
317 | for node in c.changeset.changed: |
|
323 | for node in c.changeset.changed: | |
318 | filenode_old = c.changeset_parent.get_node(node.path) |
|
324 | filenode_old = c.changeset_parent.get_node(node.path) | |
319 | if filenode_old.is_binary or node.is_binary: |
|
325 | if filenode_old.is_binary or node.is_binary: | |
320 | diff = _('binary file') |
|
326 | diff = _('binary file') | |
321 | else: |
|
327 | else: | |
322 | f_gitdiff = diffs.get_gitdiff(filenode_old, node, |
|
328 | f_gitdiff = diffs.get_gitdiff(filenode_old, node, | |
323 | ignore_whitespace=ignore_whitespace, |
|
329 | ignore_whitespace=ignore_whitespace, | |
324 | context=line_context) |
|
330 | context=line_context) | |
325 | diff = diffs.DiffProcessor(f_gitdiff, |
|
331 | diff = diffs.DiffProcessor(f_gitdiff, | |
326 | format='gitdiff').raw_diff() |
|
332 | format='gitdiff').raw_diff() | |
327 |
|
333 | |||
328 |
cs1 = filenode_old. |
|
334 | cs1 = filenode_old.changeset.raw_id | |
329 |
cs2 = node. |
|
335 | cs2 = node.changeset.raw_id | |
330 | c.changes.append(('changed', node, diff, cs1, cs2)) |
|
336 | c.changes.append(('changed', node, diff, cs1, cs2)) | |
331 |
|
337 | |||
332 | response.content_type = 'text/plain' |
|
338 | response.content_type = 'text/plain' | |
333 |
|
339 | |||
334 | if method == 'download': |
|
340 | if method == 'download': | |
335 | response.content_disposition = 'attachment; filename=%s.patch' \ |
|
341 | response.content_disposition = 'attachment; filename=%s.patch' \ | |
336 | % revision |
|
342 | % revision | |
337 |
|
343 | |||
338 |
c.parent_tmpl = ''.join(['# Parent %s\n' % x.raw_id |
|
344 | c.parent_tmpl = ''.join(['# Parent %s\n' % x.raw_id | |
339 |
|
|
345 | for x in c.changeset.parents]) | |
340 |
|
346 | |||
341 | c.diffs = '' |
|
347 | c.diffs = '' | |
342 | for x in c.changes: |
|
348 | for x in c.changes: | |
343 | c.diffs += x[2] |
|
349 | c.diffs += x[2] | |
344 |
|
350 | |||
345 | return render('changeset/raw_changeset.html') |
|
351 | return render('changeset/raw_changeset.html') | |
346 |
|
352 | |||
347 | def comment(self, repo_name, revision): |
|
353 | def comment(self, repo_name, revision): | |
348 | ChangesetCommentsModel().create(text=request.POST.get('text'), |
|
354 | ChangesetCommentsModel().create(text=request.POST.get('text'), | |
349 | repo_id=c.rhodecode_db_repo.repo_id, |
|
355 | repo_id=c.rhodecode_db_repo.repo_id, | |
350 | user_id=c.rhodecode_user.user_id, |
|
356 | user_id=c.rhodecode_user.user_id, | |
351 | revision=revision, |
|
357 | revision=revision, | |
352 | f_path=request.POST.get('f_path'), |
|
358 | f_path=request.POST.get('f_path'), | |
353 | line_no=request.POST.get('line')) |
|
359 | line_no=request.POST.get('line')) | |
354 | Session.commit() |
|
360 | Session.commit() | |
355 | return redirect(h.url('changeset_home', repo_name=repo_name, |
|
361 | return redirect(h.url('changeset_home', repo_name=repo_name, | |
356 | revision=revision)) |
|
362 | revision=revision)) | |
357 |
|
363 | |||
358 | @jsonify |
|
364 | @jsonify | |
359 | def delete_comment(self, repo_name, comment_id): |
|
365 | def delete_comment(self, repo_name, comment_id): | |
360 | co = ChangesetComment.get(comment_id) |
|
366 | co = ChangesetComment.get(comment_id) | |
361 | owner = lambda: co.author.user_id == c.rhodecode_user.user_id |
|
367 | owner = lambda: co.author.user_id == c.rhodecode_user.user_id | |
362 | if h.HasPermissionAny('hg.admin', 'repository.admin')() or owner(): |
|
368 | if h.HasPermissionAny('hg.admin', 'repository.admin')() or owner(): | |
363 | ChangesetCommentsModel().delete(comment=co) |
|
369 | ChangesetCommentsModel().delete(comment=co) | |
364 | Session.commit() |
|
370 | Session.commit() | |
365 | return True |
|
371 | return True | |
366 | else: |
|
372 | else: | |
367 | raise HTTPForbidden() |
|
373 | raise HTTPForbidden() |
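The owner check in `delete_comment` relies on *calling* the `owner` lambda; writing `or owner` without the call parentheses would test the function object itself, which is always truthy, silently granting delete rights to every user. A minimal standalone sketch of the pitfall (variable names are illustrative, not RhodeCode API):

```python
# A function object is truthy regardless of what it would return,
# so testing the callable instead of its result bypasses the check.
owner_id = 2
current_user_id = 1

owner = lambda: owner_id == current_user_id

# Testing the callable itself: always True (permission check bypassed).
always_true = bool(owner)
# Testing the call result: correctly False for a non-owner.
is_owner = owner()

print(always_true, is_owner)
# → True False
```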
@@ -1,493 +1,494 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | """ |
|
2 | """ | |
3 | rhodecode.controllers.files |
|
3 | rhodecode.controllers.files | |
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
5 |
|
5 | |||
6 | Files controller for RhodeCode |
|
6 | Files controller for RhodeCode | |
7 |
|
7 | |||
8 | :created_on: Apr 21, 2010 |
|
8 | :created_on: Apr 21, 2010 | |
9 | :author: marcink |
|
9 | :author: marcink | |
10 | :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com> |
|
10 | :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com> | |
11 | :license: GPLv3, see COPYING for more details. |
|
11 | :license: GPLv3, see COPYING for more details. | |
12 | """ |
|
12 | """ | |
13 | # This program is free software: you can redistribute it and/or modify |
|
13 | # This program is free software: you can redistribute it and/or modify | |
14 | # it under the terms of the GNU General Public License as published by |
|
14 | # it under the terms of the GNU General Public License as published by | |
15 | # the Free Software Foundation, either version 3 of the License, or |
|
15 | # the Free Software Foundation, either version 3 of the License, or | |
16 | # (at your option) any later version. |
|
16 | # (at your option) any later version. | |
17 | # |
|
17 | # | |
18 | # This program is distributed in the hope that it will be useful, |
|
18 | # This program is distributed in the hope that it will be useful, | |
19 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
19 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
20 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
20 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
21 | # GNU General Public License for more details. |
|
21 | # GNU General Public License for more details. | |
22 | # |
|
22 | # | |
23 | # You should have received a copy of the GNU General Public License |
|
23 | # You should have received a copy of the GNU General Public License | |
24 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
24 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
25 |
|
25 | |||
26 | import os |
|
26 | import os | |
27 | import logging |
|
27 | import logging | |
28 | import traceback |
|
28 | import traceback | |
29 |
|
29 | |||
30 | from pylons import request, response, tmpl_context as c, url |
|
30 | from pylons import request, response, tmpl_context as c, url | |
31 | from pylons.i18n.translation import _ |
|
31 | from pylons.i18n.translation import _ | |
32 | from pylons.controllers.util import redirect |
|
32 | from pylons.controllers.util import redirect | |
33 | from pylons.decorators import jsonify |
|
33 | from pylons.decorators import jsonify | |
34 |
|
34 | |||
35 | from rhodecode.lib.vcs.conf import settings |
|
35 | from rhodecode.lib.vcs.conf import settings | |
36 | from rhodecode.lib.vcs.exceptions import RepositoryError, ChangesetDoesNotExistError, \ |
|
36 | from rhodecode.lib.vcs.exceptions import RepositoryError, ChangesetDoesNotExistError, \ | |
37 | EmptyRepositoryError, ImproperArchiveTypeError, VCSError, \ |
|
37 | EmptyRepositoryError, ImproperArchiveTypeError, VCSError, \ | |
38 | NodeAlreadyExistsError |
|
38 | NodeAlreadyExistsError | |
39 | from rhodecode.lib.vcs.nodes import FileNode |
|
39 | from rhodecode.lib.vcs.nodes import FileNode | |
40 |
|
40 | |||
41 | from rhodecode.lib.compat import OrderedDict |
|
41 | from rhodecode.lib.compat import OrderedDict | |
42 | from rhodecode.lib import convert_line_endings, detect_mode, safe_str |
|
42 | from rhodecode.lib import convert_line_endings, detect_mode, safe_str | |
43 | from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator |
|
43 | from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator | |
44 | from rhodecode.lib.base import BaseRepoController, render |
|
44 | from rhodecode.lib.base import BaseRepoController, render | |
45 | from rhodecode.lib.utils import EmptyChangeset |
|
45 | from rhodecode.lib.utils import EmptyChangeset | |
46 | from rhodecode.lib import diffs |
|
46 | from rhodecode.lib import diffs | |
47 | import rhodecode.lib.helpers as h |
|
47 | import rhodecode.lib.helpers as h | |
48 | from rhodecode.model.repo import RepoModel |
|
48 | from rhodecode.model.repo import RepoModel | |
49 | from rhodecode.controllers.changeset import anchor_url, _ignorews_url,\ |
|
49 | from rhodecode.controllers.changeset import anchor_url, _ignorews_url,\ | |
50 | _context_url, get_line_ctx, get_ignore_ws |
|
50 | _context_url, get_line_ctx, get_ignore_ws | |
51 | from rhodecode.lib.diffs import wrapped_diff |
|
51 | from rhodecode.lib.diffs import wrapped_diff | |
52 | from rhodecode.model.scm import ScmModel |
|
52 | from rhodecode.model.scm import ScmModel | |
53 |
|
53 | |||
54 | log = logging.getLogger(__name__) |
|
54 | log = logging.getLogger(__name__) | |
55 |
|
55 | |||
56 |
|
56 | |||
57 | class FilesController(BaseRepoController): |
|
57 | class FilesController(BaseRepoController): | |
58 |
|
58 | |||
59 | @LoginRequired() |
|
59 | @LoginRequired() | |
60 | def __before__(self): |
|
60 | def __before__(self): | |
61 | super(FilesController, self).__before__() |
|
61 | super(FilesController, self).__before__() | |
62 | c.cut_off_limit = self.cut_off_limit |
|
62 | c.cut_off_limit = self.cut_off_limit | |
63 |
|
63 | |||
64 | def __get_cs_or_redirect(self, rev, repo_name, redirect_after=True): |
|
64 | def __get_cs_or_redirect(self, rev, repo_name, redirect_after=True): | |
65 | """ |
|
65 | """ | |
66 | Safe way to get a changeset; if an error occurs it redirects to tip |
|
66 | Safe way to get a changeset; if an error occurs it redirects to tip | |
67 | with a proper message |
|
67 | with a proper message | |
68 |
|
68 | |||
69 | :param rev: revision to fetch |
|
69 | :param rev: revision to fetch | |
70 | :param repo_name: repo name to redirect after |
|
70 | :param repo_name: repo name to redirect after | |
71 | """ |
|
71 | """ | |
72 |
|
72 | |||
73 | try: |
|
73 | try: | |
74 | return c.rhodecode_repo.get_changeset(rev) |
|
74 | return c.rhodecode_repo.get_changeset(rev) | |
75 | except EmptyRepositoryError, e: |
|
75 | except EmptyRepositoryError, e: | |
76 | if not redirect_after: |
|
76 | if not redirect_after: | |
77 | return None |
|
77 | return None | |
78 | url_ = url('files_add_home', |
|
78 | url_ = url('files_add_home', | |
79 | repo_name=c.repo_name, |
|
79 | repo_name=c.repo_name, | |
80 | revision=0, f_path='') |
|
80 | revision=0, f_path='') | |
81 | add_new = '<a href="%s">[%s]</a>' % (url_, _('add new')) |
|
81 | add_new = '<a href="%s">[%s]</a>' % (url_, _('add new')) | |
82 | h.flash(h.literal(_('There are no files yet %s' % add_new)), |
|
82 | h.flash(h.literal(_('There are no files yet %s' % add_new)), | |
83 | category='warning') |
|
83 | category='warning') | |
84 | redirect(h.url('summary_home', repo_name=repo_name)) |
|
84 | redirect(h.url('summary_home', repo_name=repo_name)) | |
85 |
|
85 | |||
86 | except RepositoryError, e: |
|
86 | except RepositoryError, e: | |
87 | h.flash(str(e), category='warning') |
|
87 | h.flash(str(e), category='warning') | |
88 | redirect(h.url('files_home', repo_name=repo_name, revision='tip')) |
|
88 | redirect(h.url('files_home', repo_name=repo_name, revision='tip')) | |
89 |
|
89 | |||
90 | def __get_filenode_or_redirect(self, repo_name, cs, path): |
|
90 | def __get_filenode_or_redirect(self, repo_name, cs, path): | |
91 | """ |
|
91 | """ | |
92 | Returns file_node; if an error occurs or the given path is a |
|
92 | Returns file_node; if an error occurs or the given path is a | |
93 | directory, it redirects to the top-level path |
|
93 | directory, it redirects to the top-level path | |
94 |
|
94 | |||
95 | :param repo_name: repo_name |
|
95 | :param repo_name: repo_name | |
96 | :param cs: given changeset |
|
96 | :param cs: given changeset | |
97 | :param path: path to lookup |
|
97 | :param path: path to lookup | |
98 | """ |
|
98 | """ | |
99 |
|
99 | |||
100 | try: |
|
100 | try: | |
101 | file_node = cs.get_node(path) |
|
101 | file_node = cs.get_node(path) | |
102 | if file_node.is_dir(): |
|
102 | if file_node.is_dir(): | |
103 | raise RepositoryError('given path is a directory') |
|
103 | raise RepositoryError('given path is a directory') | |
104 | except RepositoryError, e: |
|
104 | except RepositoryError, e: | |
105 | h.flash(str(e), category='warning') |
|
105 | h.flash(str(e), category='warning') | |
106 | redirect(h.url('files_home', repo_name=repo_name, |
|
106 | redirect(h.url('files_home', repo_name=repo_name, | |
107 | revision=cs.raw_id)) |
|
107 | revision=cs.raw_id)) | |
108 |
|
108 | |||
109 | return file_node |
|
109 | return file_node | |
110 |
|
110 | |||
111 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
111 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
112 | 'repository.admin') |
|
112 | 'repository.admin') | |
113 | def index(self, repo_name, revision, f_path): |
|
113 | def index(self, repo_name, revision, f_path): | |
114 | # redirect to given revision from form if given |
|
114 | # redirect to given revision from form if given | |
115 | post_revision = request.POST.get('at_rev', None) |
|
115 | post_revision = request.POST.get('at_rev', None) | |
116 | if post_revision: |
|
116 | if post_revision: | |
117 | cs = self.__get_cs_or_redirect(post_revision, repo_name) |
|
117 | cs = self.__get_cs_or_redirect(post_revision, repo_name) | |
118 | redirect(url('files_home', repo_name=c.repo_name, |
|
118 | redirect(url('files_home', repo_name=c.repo_name, | |
119 | revision=cs.raw_id, f_path=f_path)) |
|
119 | revision=cs.raw_id, f_path=f_path)) | |
120 |
|
120 | |||
121 | c.changeset = self.__get_cs_or_redirect(revision, repo_name) |
|
121 | c.changeset = self.__get_cs_or_redirect(revision, repo_name) | |
122 | c.branch = request.GET.get('branch', None) |
|
122 | c.branch = request.GET.get('branch', None) | |
123 | c.f_path = f_path |
|
123 | c.f_path = f_path | |
124 |
|
124 | |||
125 | cur_rev = c.changeset.revision |
|
125 | cur_rev = c.changeset.revision | |
126 |
|
126 | |||
127 | # prev link |
|
127 | # prev link | |
128 | try: |
|
128 | try: | |
129 | prev_rev = c.rhodecode_repo.get_changeset(cur_rev).prev(c.branch) |
|
129 | prev_rev = c.rhodecode_repo.get_changeset(cur_rev).prev(c.branch) | |
130 | c.url_prev = url('files_home', repo_name=c.repo_name, |
|
130 | c.url_prev = url('files_home', repo_name=c.repo_name, | |
131 | revision=prev_rev.raw_id, f_path=f_path) |
|
131 | revision=prev_rev.raw_id, f_path=f_path) | |
132 | if c.branch: |
|
132 | if c.branch: | |
133 | c.url_prev += '?branch=%s' % c.branch |
|
133 | c.url_prev += '?branch=%s' % c.branch | |
134 | except (ChangesetDoesNotExistError, VCSError): |
|
134 | except (ChangesetDoesNotExistError, VCSError): | |
135 | c.url_prev = '#' |
|
135 | c.url_prev = '#' | |
136 |
|
136 | |||
137 | # next link |
|
137 | # next link | |
138 | try: |
|
138 | try: | |
139 | next_rev = c.rhodecode_repo.get_changeset(cur_rev).next(c.branch) |
|
139 | next_rev = c.rhodecode_repo.get_changeset(cur_rev).next(c.branch) | |
140 | c.url_next = url('files_home', repo_name=c.repo_name, |
|
140 | c.url_next = url('files_home', repo_name=c.repo_name, | |
141 | revision=next_rev.raw_id, f_path=f_path) |
|
141 | revision=next_rev.raw_id, f_path=f_path) | |
142 | if c.branch: |
|
142 | if c.branch: | |
143 | c.url_next += '?branch=%s' % c.branch |
|
143 | c.url_next += '?branch=%s' % c.branch | |
144 | except (ChangesetDoesNotExistError, VCSError): |
|
144 | except (ChangesetDoesNotExistError, VCSError): | |
145 | c.url_next = '#' |
|
145 | c.url_next = '#' | |
146 |
|
146 | |||
147 | # files or dirs |
|
147 | # files or dirs | |
148 | try: |
|
148 | try: | |
149 | c.file = c.changeset.get_node(f_path) |
|
149 | c.file = c.changeset.get_node(f_path) | |
150 |
|
150 | |||
151 | if c.file.is_file(): |
|
151 | if c.file.is_file(): | |
152 | c.file_history = self._get_node_history(c.changeset, f_path) |
|
152 | c.file_history = self._get_node_history(c.changeset, f_path) | |
153 | else: |
|
153 | else: | |
154 | c.file_history = [] |
|
154 | c.file_history = [] | |
155 | except RepositoryError, e: |
|
155 | except RepositoryError, e: | |
156 | h.flash(str(e), category='warning') |
|
156 | h.flash(str(e), category='warning') | |
157 | redirect(h.url('files_home', repo_name=repo_name, |
|
157 | redirect(h.url('files_home', repo_name=repo_name, | |
158 | revision=revision)) |
|
158 | revision=revision)) | |
159 |
|
159 | |||
160 | return render('files/files.html') |
|
160 | return render('files/files.html') | |
161 |
|
161 | |||
162 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
162 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
163 | 'repository.admin') |
|
163 | 'repository.admin') | |
164 | def rawfile(self, repo_name, revision, f_path): |
|
164 | def rawfile(self, repo_name, revision, f_path): | |
165 | cs = self.__get_cs_or_redirect(revision, repo_name) |
|
165 | cs = self.__get_cs_or_redirect(revision, repo_name) | |
166 | file_node = self.__get_filenode_or_redirect(repo_name, cs, f_path) |
|
166 | file_node = self.__get_filenode_or_redirect(repo_name, cs, f_path) | |
167 |
|
167 | |||
168 | response.content_disposition = 'attachment; filename=%s' % \ |
|
168 | response.content_disposition = 'attachment; filename=%s' % \ | |
169 | safe_str(f_path.split(os.sep)[-1]) |
|
169 | safe_str(f_path.split(os.sep)[-1]) | |
170 |
|
170 | |||
171 | response.content_type = file_node.mimetype |
|
171 | response.content_type = file_node.mimetype | |
172 | return file_node.content |
|
172 | return file_node.content | |
173 |
|
173 | |||
174 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
174 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
175 | 'repository.admin') |
|
175 | 'repository.admin') | |
176 | def raw(self, repo_name, revision, f_path): |
|
176 | def raw(self, repo_name, revision, f_path): | |
177 | cs = self.__get_cs_or_redirect(revision, repo_name) |
|
177 | cs = self.__get_cs_or_redirect(revision, repo_name) | |
178 | file_node = self.__get_filenode_or_redirect(repo_name, cs, f_path) |
|
178 | file_node = self.__get_filenode_or_redirect(repo_name, cs, f_path) | |
179 |
|
179 | |||
180 | raw_mimetype_mapping = { |
|
180 | raw_mimetype_mapping = { | |
181 | # map original mimetype to a mimetype used for "show as raw" |
|
181 | # map original mimetype to a mimetype used for "show as raw" | |
182 | # you can also provide a content-disposition to override the |
|
182 | # you can also provide a content-disposition to override the | |
183 | # default "attachment" disposition. |
|
183 | # default "attachment" disposition. | |
184 | # orig_type: (new_type, new_dispo) |
|
184 | # orig_type: (new_type, new_dispo) | |
185 |
|
185 | |||
186 | # show images inline: |
|
186 | # show images inline: | |
187 | 'image/x-icon': ('image/x-icon', 'inline'), |
|
187 | 'image/x-icon': ('image/x-icon', 'inline'), | |
188 | 'image/png': ('image/png', 'inline'), |
|
188 | 'image/png': ('image/png', 'inline'), | |
189 | 'image/gif': ('image/gif', 'inline'), |
|
189 | 'image/gif': ('image/gif', 'inline'), | |
190 | 'image/jpeg': ('image/jpeg', 'inline'), |
|
190 | 'image/jpeg': ('image/jpeg', 'inline'), | |
191 | 'image/svg+xml': ('image/svg+xml', 'inline'), |
|
191 | 'image/svg+xml': ('image/svg+xml', 'inline'), | |
192 | } |
|
192 | } | |
193 |
|
193 | |||
194 | mimetype = file_node.mimetype |
|
194 | mimetype = file_node.mimetype | |
195 | try: |
|
195 | try: | |
196 | mimetype, dispo = raw_mimetype_mapping[mimetype] |
|
196 | mimetype, dispo = raw_mimetype_mapping[mimetype] | |
197 | except KeyError: |
|
197 | except KeyError: | |
198 | # we don't know anything special about this, handle it safely |
|
198 | # we don't know anything special about this, handle it safely | |
199 | if file_node.is_binary: |
|
199 | if file_node.is_binary: | |
200 | # do same as download raw for binary files |
|
200 | # do same as download raw for binary files | |
201 | mimetype, dispo = 'application/octet-stream', 'attachment' |
|
201 | mimetype, dispo = 'application/octet-stream', 'attachment' | |
202 | else: |
|
202 | else: | |
203 | # do not just use the original mimetype, but force text/plain, |
|
203 | # do not just use the original mimetype, but force text/plain, | |
204 | # otherwise it would serve text/html and that might be unsafe. |
|
204 | # otherwise it would serve text/html and that might be unsafe. | |
205 | # Note: underlying vcs library fakes text/plain mimetype if the |
|
205 | # Note: underlying vcs library fakes text/plain mimetype if the | |
206 | # mimetype cannot be determined and it thinks it is not |
|
206 | # mimetype cannot be determined and it thinks it is not | |
207 | # binary. This might lead to erroneous text display in some |
|
207 | # binary. This might lead to erroneous text display in some | |
208 | # cases, but helps in other cases, like with text files |
|
208 | # cases, but helps in other cases, like with text files | |
209 | # without extension. |
|
209 | # without extension. | |
210 | mimetype, dispo = 'text/plain', 'inline' |
|
210 | mimetype, dispo = 'text/plain', 'inline' | |
211 |
|
211 | |||
212 | if dispo == 'attachment': |
|
212 | if dispo == 'attachment': | |
213 | dispo = 'attachment; filename=%s' % \ |
|
213 | dispo = 'attachment; filename=%s' % \ | |
214 | safe_str(f_path.split(os.sep)[-1]) |
|
214 | safe_str(f_path.split(os.sep)[-1]) | |
215 |
|
215 | |||
216 | response.content_disposition = dispo |
|
216 | response.content_disposition = dispo | |
217 | response.content_type = mimetype |
|
217 | response.content_type = mimetype | |
218 | return file_node.content |
|
218 | return file_node.content | |
219 |
|
219 | |||
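The fallback logic in `raw()` above boils down to three cases: whitelisted image types are served inline as-is, unknown binaries become `application/octet-stream` attachments, and everything else is forced to `text/plain` so the browser never interprets it as HTML. A condensed, hypothetical sketch of that decision (the function and mapping names here are illustrative, not the actual RhodeCode implementation):

```python
# Hypothetical standalone version of the "show as raw" header decision.
RAW_MIMETYPE_MAPPING = {
    # orig_type: (new_type, new_disposition)
    'image/png': ('image/png', 'inline'),
    'image/gif': ('image/gif', 'inline'),
}

def raw_headers(mimetype, is_binary, filename):
    """Return (content_type, content_disposition) for a raw file view."""
    try:
        mimetype, dispo = RAW_MIMETYPE_MAPPING[mimetype]
    except KeyError:
        if is_binary:
            # unknown binary: force a download
            mimetype, dispo = 'application/octet-stream', 'attachment'
        else:
            # unknown text: never serve the original type (could be text/html)
            mimetype, dispo = 'text/plain', 'inline'
    if dispo == 'attachment':
        dispo = 'attachment; filename=%s' % filename
    return mimetype, dispo

print(raw_headers('text/html', False, 'page.html'))
# → ('text/plain', 'inline')
```

The key design point, as the controller's comments note, is that an unrecognised text type must never be echoed back with its original mimetype, since `text/html` served from a repository would be a stored-XSS vector.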
220 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
220 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
221 | 'repository.admin') |
|
221 | 'repository.admin') | |
222 | def annotate(self, repo_name, revision, f_path): |
|
222 | def annotate(self, repo_name, revision, f_path): | |
223 | c.cs = self.__get_cs_or_redirect(revision, repo_name) |
|
223 | c.cs = self.__get_cs_or_redirect(revision, repo_name) | |
224 | c.file = self.__get_filenode_or_redirect(repo_name, c.cs, f_path) |
|
224 | c.file = self.__get_filenode_or_redirect(repo_name, c.cs, f_path) | |
225 |
|
225 | |||
226 | c.file_history = self._get_node_history(c.cs, f_path) |
|
226 | c.file_history = self._get_node_history(c.cs, f_path) | |
227 | c.f_path = f_path |
|
227 | c.f_path = f_path | |
228 | return render('files/files_annotate.html') |
|
228 | return render('files/files_annotate.html') | |
229 |
|
229 | |||
230 | @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin') |
|
230 | @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin') | |
231 | def edit(self, repo_name, revision, f_path): |
|
231 | def edit(self, repo_name, revision, f_path): | |
232 | r_post = request.POST |
|
232 | r_post = request.POST | |
233 |
|
233 | |||
234 | c.cs = self.__get_cs_or_redirect(revision, repo_name) |
|
234 | c.cs = self.__get_cs_or_redirect(revision, repo_name) | |
235 | c.file = self.__get_filenode_or_redirect(repo_name, c.cs, f_path) |
|
235 | c.file = self.__get_filenode_or_redirect(repo_name, c.cs, f_path) | |
236 |
|
236 | |||
237 | if c.file.is_binary: |
|
237 | if c.file.is_binary: | |
238 | return redirect(url('files_home', repo_name=c.repo_name, |
|
238 | return redirect(url('files_home', repo_name=c.repo_name, | |
239 | revision=c.cs.raw_id, f_path=f_path)) |
|
239 | revision=c.cs.raw_id, f_path=f_path)) | |
240 |
|
240 | |||
241 | c.f_path = f_path |
|
241 | c.f_path = f_path | |
242 |
|
242 | |||
243 | if r_post: |
|
243 | if r_post: | |
244 |
|
244 | |||
245 | old_content = c.file.content |
|
245 | old_content = c.file.content | |
246 | sl = old_content.splitlines(1) |
|
246 | sl = old_content.splitlines(1) | |
247 | first_line = sl[0] if sl else '' |
|
247 | first_line = sl[0] if sl else '' | |
248 | # modes: 0 - Unix, 1 - Mac, 2 - DOS |
|
248 | # modes: 0 - Unix, 1 - Mac, 2 - DOS | |
249 | mode = detect_mode(first_line, 0) |
|
249 | mode = detect_mode(first_line, 0) | |
250 | content = convert_line_endings(r_post.get('content'), mode) |
|
250 | content = convert_line_endings(r_post.get('content'), mode) | |
251 |
|
251 | |||
252 | message = r_post.get('message') or (_('Edited %s via RhodeCode') |
|
252 | message = r_post.get('message') or (_('Edited %s via RhodeCode') | |
253 | % (f_path)) |
|
253 | % (f_path)) | |
254 | author = self.rhodecode_user.full_contact |
|
254 | author = self.rhodecode_user.full_contact | |
255 |
|
255 | |||
256 | if content == old_content: |
|
256 | if content == old_content: | |
257 | h.flash(_('No changes'), |
|
257 | h.flash(_('No changes'), | |
258 | category='warning') |
|
258 | category='warning') | |
259 | return redirect(url('changeset_home', repo_name=c.repo_name, |
|
259 | return redirect(url('changeset_home', repo_name=c.repo_name, | |
260 | revision='tip')) |
|
260 | revision='tip')) | |
261 |
|
261 | |||
262 | try: |
|
262 | try: | |
263 | self.scm_model.commit_change(repo=c.rhodecode_repo, |
|
263 | self.scm_model.commit_change(repo=c.rhodecode_repo, | |
264 | repo_name=repo_name, cs=c.cs, |
|
264 | repo_name=repo_name, cs=c.cs, | |
265 | user=self.rhodecode_user, |
|
265 | user=self.rhodecode_user, | |
266 | author=author, message=message, |
|
266 | author=author, message=message, | |
267 | content=content, f_path=f_path) |
|
267 | content=content, f_path=f_path) | |
268 | h.flash(_('Successfully committed to %s' % f_path), |
|
268 | h.flash(_('Successfully committed to %s' % f_path), | |
269 | category='success') |
|
269 | category='success') | |
270 |
|
270 | |||
271 | except Exception: |
|
271 | except Exception: | |
272 | log.error(traceback.format_exc()) |
|
272 | log.error(traceback.format_exc()) | |
273 | h.flash(_('Error occurred during commit'), category='error') |
|
273 | h.flash(_('Error occurred during commit'), category='error') | |
274 | return redirect(url('changeset_home', |
|
274 | return redirect(url('changeset_home', | |
275 | repo_name=c.repo_name, revision='tip')) |
|
275 | repo_name=c.repo_name, revision='tip')) | |
276 |
|
276 | |||
277 | return render('files/files_edit.html') |
|
277 | return render('files/files_edit.html') | |
278 |
|
278 | |||
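`edit()` above detects the file's existing line-ending convention from its first line (`# modes: 0 - Unix, 1 - Mac, 2 - DOS`) and converts the posted content to match before committing. A self-contained sketch of that detect/convert pair; these are plausible reimplementations for illustration, not the actual `rhodecode.lib` functions:

```python
def detect_mode(line, default):
    """Detect line-ending mode from one line: 0 - Unix, 1 - Mac, 2 - DOS."""
    if line.endswith('\r\n'):
        return 2
    elif line.endswith('\n'):
        return 0
    elif line.endswith('\r'):
        return 1
    return default

def convert_line_endings(text, mode):
    """Normalise every line ending in `text` to the given mode."""
    eol = {0: '\n', 1: '\r', 2: '\r\n'}[mode]
    # collapse all variants to \n first, then expand to the target eol
    return text.replace('\r\n', '\n').replace('\r', '\n').replace('\n', eol)

first_line = 'line one\r\n'
mode = detect_mode(first_line, 0)       # DOS file detected
converted = convert_line_endings('a\nb\r\nc', mode)
print(mode, repr(converted))
# → 2 'a\r\nb\r\nc'
```

Note the order of checks in `detect_mode`: `\r\n` must be tested before `\n`, otherwise every DOS file would be misclassified as Unix.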
279 | @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin') |
|
279 | @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin') | |
280 | def add(self, repo_name, revision, f_path): |
|
280 | def add(self, repo_name, revision, f_path): | |
281 | r_post = request.POST |
|
281 | r_post = request.POST | |
282 | c.cs = self.__get_cs_or_redirect(revision, repo_name, |
|
282 | c.cs = self.__get_cs_or_redirect(revision, repo_name, | |
283 | redirect_after=False) |
|
283 | redirect_after=False) | |
284 | if c.cs is None: |
|
284 | if c.cs is None: | |
285 | c.cs = EmptyChangeset(alias=c.rhodecode_repo.alias) |
|
285 | c.cs = EmptyChangeset(alias=c.rhodecode_repo.alias) | |
286 |
|
286 | |||
287 | c.f_path = f_path |
|
287 | c.f_path = f_path | |
288 |
|
288 | |||
289 | if r_post: |
|
289 | if r_post: | |
290 | unix_mode = 0 |
|
290 | unix_mode = 0 | |
291 | content = convert_line_endings(r_post.get('content'), unix_mode) |
|
291 | content = convert_line_endings(r_post.get('content'), unix_mode) | |
292 |
|
292 | |||
293 | message = r_post.get('message') or (_('Added %s via RhodeCode') |
|
293 | message = r_post.get('message') or (_('Added %s via RhodeCode') | |
294 | % (f_path)) |
|
294 | % (f_path)) | |
295 | location = r_post.get('location') |
|
295 | location = r_post.get('location') | |
296 | filename = r_post.get('filename') |
|
296 | filename = r_post.get('filename') | |
297 | file_obj = r_post.get('upload_file', None) |
|
297 | file_obj = r_post.get('upload_file', None) | |
298 |
|
298 | |||
299 | if file_obj is not None and hasattr(file_obj, 'filename'): |
|
299 | if file_obj is not None and hasattr(file_obj, 'filename'): | |
300 | filename = file_obj.filename |
|
300 | filename = file_obj.filename | |
301 | content = file_obj.file |
|
301 | content = file_obj.file | |
302 |
|
302 | |||
303 | node_path = os.path.join(location, filename) |
|
303 | node_path = os.path.join(location, filename) | |
304 | author = self.rhodecode_user.full_contact |
|
304 | author = self.rhodecode_user.full_contact | |
305 |
|
305 | |||
306 | if not content: |
|
306 | if not content: | |
307 | h.flash(_('No content'), category='warning') |
|
307 | h.flash(_('No content'), category='warning') | |
308 | return redirect(url('changeset_home', repo_name=c.repo_name, |
|
308 | return redirect(url('changeset_home', repo_name=c.repo_name, | |
309 | revision='tip')) |
|
309 | revision='tip')) | |
310 | if not filename: |
|
310 | if not filename: | |
311 | h.flash(_('No filename'), category='warning') |
|
311 | h.flash(_('No filename'), category='warning') | |
312 | return redirect(url('changeset_home', repo_name=c.repo_name, |
|
312 | return redirect(url('changeset_home', repo_name=c.repo_name, | |
313 | revision='tip')) |
|
313 | revision='tip')) | |
314 |
|
314 | |||
315 | try: |
|
315 | try: | |
316 | self.scm_model.create_node(repo=c.rhodecode_repo, |
|
316 | self.scm_model.create_node(repo=c.rhodecode_repo, | |
317 | repo_name=repo_name, cs=c.cs, |
|
317 | repo_name=repo_name, cs=c.cs, | |
318 | user=self.rhodecode_user, |
|
318 | user=self.rhodecode_user, | |
319 | author=author, message=message, |
|
319 | author=author, message=message, | |
320 | content=content, f_path=node_path) |
|
320 | content=content, f_path=node_path) | |
321 | h.flash(_('Successfully committed to %s' % node_path), |
|
321 | h.flash(_('Successfully committed to %s' % node_path), | |
322 | category='success') |
|
322 | category='success') | |
323 | except NodeAlreadyExistsError, e: |
|
323 | except NodeAlreadyExistsError, e: | |
324 | h.flash(_(e), category='error') |
|
324 | h.flash(_(e), category='error') | |
325 | except Exception: |
|
325 | except Exception: | |
326 | log.error(traceback.format_exc()) |
|
326 | log.error(traceback.format_exc()) | |
327 | h.flash(_('Error occurred during commit'), category='error') |
|
327 | h.flash(_('Error occurred during commit'), category='error') | |
328 | return redirect(url('changeset_home', |
|
328 | return redirect(url('changeset_home', | |
329 | repo_name=c.repo_name, revision='tip')) |
|
329 | repo_name=c.repo_name, revision='tip')) | |
330 |
|
330 | |||
331 | return render('files/files_add.html') |
|
331 | return render('files/files_add.html') | |
332 |
|
332 | |||
333 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
333 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
334 | 'repository.admin') |
|
334 | 'repository.admin') | |
335 | def archivefile(self, repo_name, fname): |
|
335 | def archivefile(self, repo_name, fname): | |
336 |
|
336 | |||
337 | fileformat = None |
|
337 | fileformat = None | |
338 | revision = None |
|
338 | revision = None | |
339 | ext = None |
|
339 | ext = None | |
340 | subrepos = request.GET.get('subrepos') == 'true' |
|
340 | subrepos = request.GET.get('subrepos') == 'true' | |
341 |
|
341 | |||
342 | for a_type, ext_data in settings.ARCHIVE_SPECS.items(): |
|
342 | for a_type, ext_data in settings.ARCHIVE_SPECS.items(): | |
343 | archive_spec = fname.split(ext_data[1]) |
|
343 | archive_spec = fname.split(ext_data[1]) | |
344 | if len(archive_spec) == 2 and archive_spec[1] == '': |
|
344 | if len(archive_spec) == 2 and archive_spec[1] == '': | |
345 | fileformat = a_type or ext_data[1] |
|
345 | fileformat = a_type or ext_data[1] | |
346 | revision = archive_spec[0] |
|
346 | revision = archive_spec[0] | |
347 | ext = ext_data[1] |
|
347 | ext = ext_data[1] | |
348 |
|
348 | |||
349 | try: |
|
349 | try: | |
350 | dbrepo = RepoModel().get_by_repo_name(repo_name) |
|
350 | dbrepo = RepoModel().get_by_repo_name(repo_name) | |
351 | if dbrepo.enable_downloads is False: |
|
351 | if dbrepo.enable_downloads is False: | |
352 | return _('downloads disabled') |
|
352 | return _('downloads disabled') | |
353 |
|
353 | |||
354 | if c.rhodecode_repo.alias == 'hg': |
|
354 | if c.rhodecode_repo.alias == 'hg': | |
355 | # patch and reset hooks section of UI config to not run any |
|
355 | # patch and reset hooks section of UI config to not run any | |
356 | # hooks on fetching archives with subrepos |
|
356 | # hooks on fetching archives with subrepos | |
357 | for k, v in c.rhodecode_repo._repo.ui.configitems('hooks'): |
|
357 | for k, v in c.rhodecode_repo._repo.ui.configitems('hooks'): | |
358 | c.rhodecode_repo._repo.ui.setconfig('hooks', k, None) |
|
358 | c.rhodecode_repo._repo.ui.setconfig('hooks', k, None) | |
359 |
|
359 | |||
360 | cs = c.rhodecode_repo.get_changeset(revision) |
|
360 | cs = c.rhodecode_repo.get_changeset(revision) | |
361 | content_type = settings.ARCHIVE_SPECS[fileformat][0] |
|
361 | content_type = settings.ARCHIVE_SPECS[fileformat][0] | |
362 | except ChangesetDoesNotExistError: |
|
362 | except ChangesetDoesNotExistError: | |
363 | return _('Unknown revision %s') % revision |
|
363 | return _('Unknown revision %s') % revision | |
364 | except EmptyRepositoryError: |
|
364 | except EmptyRepositoryError: | |
365 | return _('Empty repository') |
|
365 | return _('Empty repository') | |
366 | except (ImproperArchiveTypeError, KeyError): |
|
366 | except (ImproperArchiveTypeError, KeyError): | |
367 | return _('Unknown archive type') |
|
367 | return _('Unknown archive type') | |
368 |
|
368 | |||
369 | response.content_type = content_type |
|
369 | response.content_type = content_type | |
370 | response.content_disposition = 'attachment; filename=%s-%s%s' \ |
|
370 | response.content_disposition = 'attachment; filename=%s-%s%s' \ | |
371 | % (repo_name, revision, ext) |
|
371 | % (repo_name, revision, ext) | |
372 |
|
372 | |||
373 | import tempfile |
|
373 | import tempfile | |
374 | archive = tempfile.mkstemp()[1] |
|
374 | archive = tempfile.mkstemp()[1] | |
375 | t = open(archive, 'wb') |
|
375 | t = open(archive, 'wb') | |
376 | cs.fill_archive(stream=t, kind=fileformat, subrepos=subrepos) |
|
376 | cs.fill_archive(stream=t, kind=fileformat, subrepos=subrepos) | |
377 |
|
377 | |||
378 | def get_chunked_archive(archive): |
|
378 | def get_chunked_archive(archive): | |
379 | stream = open(archive, 'rb') |
|
379 | stream = open(archive, 'rb') | |
380 | while True: |
|
380 | while True: | |
381 | data = stream.read(4096) |
|
381 | data = stream.read(4096) | |
382 | if not data: |
|
382 | if not data: | |
383 | os.remove(archive) |
|
383 | os.remove(archive) | |
384 | break |
|
384 | break | |
385 | yield data |
|
385 | yield data | |
386 |
|
386 | |||
387 | return get_chunked_archive(archive) |
|
387 | return get_chunked_archive(archive) | |
388 |
|
388 | |||
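`get_chunked_archive` above streams a temporary archive back to the client in fixed-size reads and deletes the file once it is exhausted, so the full archive never has to sit in memory. The same pattern in isolation (the 4096-byte chunk size matches the controller; the rest is a generic sketch):

```python
import os
import tempfile

def get_chunked_archive(path, chunk_size=4096):
    """Yield a file's contents chunk by chunk, deleting it when done."""
    stream = open(path, 'rb')
    while True:
        data = stream.read(chunk_size)
        if not data:
            stream.close()
            os.remove(path)  # clean up the temp file after the last chunk
            break
        yield data

# usage: write a throwaway temp file, then stream it back
fd, path = tempfile.mkstemp()
with os.fdopen(fd, 'wb') as t:
    t.write(b'x' * 10000)

total = sum(len(chunk) for chunk in get_chunked_archive(path))
print(total, os.path.exists(path))
# → 10000 False
```

One caveat of this design, inherited from the controller: if the client disconnects mid-download the generator is never exhausted, so the temporary file can be left behind.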
389 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
389 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
390 | 'repository.admin') |
|
390 | 'repository.admin') | |
391 | def diff(self, repo_name, f_path): |
|
391 | def diff(self, repo_name, f_path): | |
392 | ignore_whitespace = request.GET.get('ignorews') == '1' |
|
392 | ignore_whitespace = request.GET.get('ignorews') == '1' | |
393 | line_context = request.GET.get('context', 3) |
|
393 | line_context = request.GET.get('context', 3) | |
394 | diff1 = request.GET.get('diff1', '') |
|
394 | diff1 = request.GET.get('diff1', '') | |
395 | diff2 = request.GET.get('diff2', '') |
|
395 | diff2 = request.GET.get('diff2', '') | |
396 | c.action = request.GET.get('diff') |
|
396 | c.action = request.GET.get('diff') | |
397 | c.no_changes = diff1 == diff2 |
|
397 | c.no_changes = diff1 == diff2 | |
398 | c.f_path = f_path |
|
398 | c.f_path = f_path | |
399 | c.big_diff = False |
|
399 | c.big_diff = False | |
400 | c.anchor_url = anchor_url |
|
400 | c.anchor_url = anchor_url | |
401 | c.ignorews_url = _ignorews_url |
|
401 | c.ignorews_url = _ignorews_url | |
402 | c.context_url = _context_url |
|
402 | c.context_url = _context_url | |
403 | c.changes = OrderedDict() |
|
403 | c.changes = OrderedDict() | |
404 | c.changes[diff2] = [] |
|
404 | c.changes[diff2] = [] | |
405 | try: |
|
405 | try: | |
406 | if diff1 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
406 | if diff1 not in ['', None, 'None', '0' * 12, '0' * 40]: | |
407 | c.changeset_1 = c.rhodecode_repo.get_changeset(diff1) |
|
407 | c.changeset_1 = c.rhodecode_repo.get_changeset(diff1) | |
408 | node1 = c.changeset_1.get_node(f_path) |
|
408 | node1 = c.changeset_1.get_node(f_path) | |
409 | else: |
|
409 | else: | |
410 | c.changeset_1 = EmptyChangeset(repo=c.rhodecode_repo) |
|
410 | c.changeset_1 = EmptyChangeset(repo=c.rhodecode_repo) | |
411 | node1 = FileNode('.', '', changeset=c.changeset_1) |
|
411 | node1 = FileNode('.', '', changeset=c.changeset_1) | |
412 |
|
412 | |||
413 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
413 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: | |
414 | c.changeset_2 = c.rhodecode_repo.get_changeset(diff2) |
|
414 | c.changeset_2 = c.rhodecode_repo.get_changeset(diff2) | |
415 | node2 = c.changeset_2.get_node(f_path) |
|
415 | node2 = c.changeset_2.get_node(f_path) | |
416 | else: |
|
416 | else: | |
417 | c.changeset_2 = EmptyChangeset(repo=c.rhodecode_repo) |
|
417 | c.changeset_2 = EmptyChangeset(repo=c.rhodecode_repo) | |
418 | node2 = FileNode('.', '', changeset=c.changeset_2) |
|
418 | node2 = FileNode('.', '', changeset=c.changeset_2) | |
419 | except RepositoryError: |
|
419 | except RepositoryError: | |
420 | return redirect(url('files_home', repo_name=c.repo_name, |
|
420 | return redirect(url('files_home', repo_name=c.repo_name, | |
421 | f_path=f_path)) |
|
421 | f_path=f_path)) | |
422 |
|
422 | |||
423 | if c.action == 'download': |
|
423 | if c.action == 'download': | |
424 | _diff = diffs.get_gitdiff(node1, node2, |
|
424 | _diff = diffs.get_gitdiff(node1, node2, | |
425 | ignore_whitespace=ignore_whitespace, |
|
425 | ignore_whitespace=ignore_whitespace, | |
426 | context=line_context) |
|
426 | context=line_context) | |
427 | diff = diffs.DiffProcessor(_diff, format='gitdiff') |
|
427 | diff = diffs.DiffProcessor(_diff, format='gitdiff') | |
428 |
|
428 | |||
429 | diff_name = '%s_vs_%s.diff' % (diff1, diff2) |
|
429 | diff_name = '%s_vs_%s.diff' % (diff1, diff2) | |
430 | response.content_type = 'text/plain' |
|
430 | response.content_type = 'text/plain' | |
431 | response.content_disposition = 'attachment; filename=%s' \ |
|
431 | response.content_disposition = ( | |
432 | % diff_name |
|
432 | 'attachment; filename=%s' % diff_name | |
|
433 | ) | |
433 | return diff.raw_diff() |
|
434 | return diff.raw_diff() | |
434 |
|
435 | |||
435 | elif c.action == 'raw': |
|
436 | elif c.action == 'raw': | |
436 | _diff = diffs.get_gitdiff(node1, node2, |
|
437 | _diff = diffs.get_gitdiff(node1, node2, | |
437 | ignore_whitespace=ignore_whitespace, |
|
438 | ignore_whitespace=ignore_whitespace, | |
438 | context=line_context) |
|
439 | context=line_context) | |
439 | diff = diffs.DiffProcessor(_diff, format='gitdiff') |
|
440 | diff = diffs.DiffProcessor(_diff, format='gitdiff') | |
440 | response.content_type = 'text/plain' |
|
441 | response.content_type = 'text/plain' | |
441 | return diff.raw_diff() |
|
442 | return diff.raw_diff() | |
442 |
|
443 | |||
443 | else: |
|
444 | else: | |
444 | fid = h.FID(diff2, node2.path) |
|
445 | fid = h.FID(diff2, node2.path) | |
445 | line_context_lcl = get_line_ctx(fid, request.GET) |
|
446 | line_context_lcl = get_line_ctx(fid, request.GET) | |
446 | ign_whitespace_lcl = get_ignore_ws(fid, request.GET) |
|
447 | ign_whitespace_lcl = get_ignore_ws(fid, request.GET) | |
447 |
|
448 | |||
448 | lim = request.GET.get('fulldiff') or self.cut_off_limit |
|
449 | lim = request.GET.get('fulldiff') or self.cut_off_limit | |
449 | _, cs1, cs2, diff, st = wrapped_diff(filenode_old=node1, |
|
450 | _, cs1, cs2, diff, st = wrapped_diff(filenode_old=node1, | |
450 | filenode_new=node2, |
|
451 | filenode_new=node2, | |
451 | cut_off_limit=lim, |
|
452 | cut_off_limit=lim, | |
452 | ignore_whitespace=ign_whitespace_lcl, |
|
453 | ignore_whitespace=ign_whitespace_lcl, | |
453 | line_context=line_context_lcl, |
|
454 | line_context=line_context_lcl, | |
454 | enable_comments=False) |
|
455 | enable_comments=False) | |
455 |
|
456 | |||
456 | c.changes = [('', node2, diff, cs1, cs2, st,)] |
|
457 | c.changes = [('', node2, diff, cs1, cs2, st,)] | |
457 |
|
458 | |||
458 | return render('files/file_diff.html') |
|
459 | return render('files/file_diff.html') | |
459 |
|
460 | |||
460 | def _get_node_history(self, cs, f_path): |
|
461 | def _get_node_history(self, cs, f_path): | |
461 | changesets = cs.get_file_history(f_path) |
|
462 | changesets = cs.get_file_history(f_path) | |
462 | hist_l = [] |
|
463 | hist_l = [] | |
463 |
|
464 | |||
464 | changesets_group = ([], _("Changesets")) |
|
465 | changesets_group = ([], _("Changesets")) | |
465 | branches_group = ([], _("Branches")) |
|
466 | branches_group = ([], _("Branches")) | |
466 | tags_group = ([], _("Tags")) |
|
467 | tags_group = ([], _("Tags")) | |
467 | _hg = cs.repository.alias == 'hg' |
|
468 | _hg = cs.repository.alias == 'hg' | |
468 | for chs in changesets: |
|
469 | for chs in changesets: | |
469 | _branch = '(%s)' % chs.branch if _hg else '' |
|
470 | _branch = '(%s)' % chs.branch if _hg else '' | |
470 | n_desc = 'r%s:%s %s' % (chs.revision, chs.short_id, _branch) |
|
471 | n_desc = 'r%s:%s %s' % (chs.revision, chs.short_id, _branch) | |
471 | changesets_group[0].append((chs.raw_id, n_desc,)) |
|
472 | changesets_group[0].append((chs.raw_id, n_desc,)) | |
472 |
|
473 | |||
473 | hist_l.append(changesets_group) |
|
474 | hist_l.append(changesets_group) | |
474 |
|
475 | |||
475 | for name, chs in c.rhodecode_repo.branches.items(): |
|
476 | for name, chs in c.rhodecode_repo.branches.items(): | |
476 | branches_group[0].append((chs, name),) |
|
477 | branches_group[0].append((chs, name),) | |
477 | hist_l.append(branches_group) |
|
478 | hist_l.append(branches_group) | |
478 |
|
479 | |||
479 | for name, chs in c.rhodecode_repo.tags.items(): |
|
480 | for name, chs in c.rhodecode_repo.tags.items(): | |
480 | tags_group[0].append((chs, name),) |
|
481 | tags_group[0].append((chs, name),) | |
481 | hist_l.append(tags_group) |
|
482 | hist_l.append(tags_group) | |
482 |
|
483 | |||
483 | return hist_l |
|
484 | return hist_l | |
484 |
|
485 | |||
485 | @jsonify |
|
486 | @jsonify | |
486 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
487 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
487 | 'repository.admin') |
|
488 | 'repository.admin') | |
488 | def nodelist(self, repo_name, revision, f_path): |
|
489 | def nodelist(self, repo_name, revision, f_path): | |
489 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
490 | if request.environ.get('HTTP_X_PARTIAL_XHR'): | |
490 | cs = self.__get_cs_or_redirect(revision, repo_name) |
|
491 | cs = self.__get_cs_or_redirect(revision, repo_name) | |
491 | _d, _f = ScmModel().get_nodes(repo_name, cs.raw_id, f_path, |
|
492 | _d, _f = ScmModel().get_nodes(repo_name, cs.raw_id, f_path, | |
492 | flat=False) |
|
493 | flat=False) | |
493 | return _d + _f |
|
494 | return _d + _f |
@@ -1,517 +1,517 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | """ |
|
2 | """ | |
3 | rhodecode.lib.diffs |
|
3 | rhodecode.lib.diffs | |
4 | ~~~~~~~~~~~~~~~~~~~ |
|
4 | ~~~~~~~~~~~~~~~~~~~ | |
5 |
|
5 | |||
6 | Set of diffing helpers, previously part of vcs |
|
6 | Set of diffing helpers, previously part of vcs | |
7 |
|
7 | |||
8 |
|
8 | |||
9 | :created_on: Dec 4, 2011 |
|
9 | :created_on: Dec 4, 2011 | |
10 | :author: marcink |
|
10 | :author: marcink | |
11 | :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com> |
|
11 | :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com> | |
12 | :original copyright: 2007-2008 by Armin Ronacher |
|
12 | :original copyright: 2007-2008 by Armin Ronacher | |
13 | :license: GPLv3, see COPYING for more details. |
|
13 | :license: GPLv3, see COPYING for more details. | |
14 | """ |
|
14 | """ | |
15 | # This program is free software: you can redistribute it and/or modify |
|
15 | # This program is free software: you can redistribute it and/or modify | |
16 | # it under the terms of the GNU General Public License as published by |
|
16 | # it under the terms of the GNU General Public License as published by | |
17 | # the Free Software Foundation, either version 3 of the License, or |
|
17 | # the Free Software Foundation, either version 3 of the License, or | |
18 | # (at your option) any later version. |
|
18 | # (at your option) any later version. | |
19 | # |
|
19 | # | |
20 | # This program is distributed in the hope that it will be useful, |
|
20 | # This program is distributed in the hope that it will be useful, | |
21 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
21 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
22 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
22 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
23 | # GNU General Public License for more details. |
|
23 | # GNU General Public License for more details. | |
24 | # |
|
24 | # | |
25 | # You should have received a copy of the GNU General Public License |
|
25 | # You should have received a copy of the GNU General Public License | |
26 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
26 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
27 |
|
27 | |||
28 | import re |
|
28 | import re | |
29 | import difflib |
|
29 | import difflib | |
30 | import markupsafe |
|
30 | import markupsafe | |
31 | from itertools import tee, imap |
|
31 | from itertools import tee, imap | |
32 |
|
32 | |||
33 | from pylons.i18n.translation import _ |
|
33 | from pylons.i18n.translation import _ | |
34 |
|
34 | |||
35 | from rhodecode.lib.vcs.exceptions import VCSError |
|
35 | from rhodecode.lib.vcs.exceptions import VCSError | |
36 | from rhodecode.lib.vcs.nodes import FileNode |
|
36 | from rhodecode.lib.vcs.nodes import FileNode | |
37 |
|
37 | |||
38 | from rhodecode.lib.utils import EmptyChangeset |
|
38 | from rhodecode.lib.utils import EmptyChangeset | |
39 |
|
39 | |||
40 |
|
40 | |||
41 | def wrap_to_table(str_): |
|
41 | def wrap_to_table(str_): | |
42 | return '''<table class="code-difftable"> |
|
42 | return '''<table class="code-difftable"> | |
43 | <tr class="line no-comment"> |
|
43 | <tr class="line no-comment"> | |
44 | <td class="lineno new"></td> |
|
44 | <td class="lineno new"></td> | |
45 | <td class="code no-comment"><pre>%s</pre></td> |
|
45 | <td class="code no-comment"><pre>%s</pre></td> | |
46 | </tr> |
|
46 | </tr> | |
47 | </table>''' % str_ |
|
47 | </table>''' % str_ | |
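`wrap_to_table` simply interpolates an already-escaped message into a one-row table where a real diff table would otherwise go. As a standalone sketch of the same helper:

```python
def wrap_to_table(str_):
    # One-row placeholder table used for messages such as
    # 'binary file' or 'No changes detected' instead of a real diff.
    return '''<table class="code-difftable">
    <tr class="line no-comment">
        <td class="lineno new"></td>
        <td class="code no-comment"><pre>%s</pre></td>
    </tr>
    </table>''' % str_

html = wrap_to_table('binary file')
```

Note the caller is responsible for escaping: the helper trusts `str_` to be safe markup.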
48 |
|
48 | |||
49 |
|
49 | |||
50 | def wrapped_diff(filenode_old, filenode_new, cut_off_limit=None, |
|
50 | def wrapped_diff(filenode_old, filenode_new, cut_off_limit=None, | |
51 | ignore_whitespace=True, line_context=3, |
|
51 | ignore_whitespace=True, line_context=3, | |
52 | enable_comments=False): |
|
52 | enable_comments=False): | |
53 | """ |
|
53 | """ | |
54 | returns a diff wrapped in a table, checks the cut_off_limit and presents |
|
54 | returns a diff wrapped in a table, checks the cut_off_limit and presents | |
55 | a proper message |
|
55 | a proper message | |
56 | """ |
|
56 | """ | |
57 |
|
57 | |||
58 | if filenode_old is None: |
|
58 | if filenode_old is None: | |
59 | filenode_old = FileNode(filenode_new.path, '', EmptyChangeset()) |
|
59 | filenode_old = FileNode(filenode_new.path, '', EmptyChangeset()) | |
60 |
|
60 | |||
61 | if filenode_old.is_binary or filenode_new.is_binary: |
|
61 | if filenode_old.is_binary or filenode_new.is_binary: | |
62 | diff = wrap_to_table(_('binary file')) |
|
62 | diff = wrap_to_table(_('binary file')) | |
63 | stats = (0, 0) |
|
63 | stats = (0, 0) | |
64 | size = 0 |
|
64 | size = 0 | |
65 |
|
65 | |||
66 | elif cut_off_limit != -1 and (cut_off_limit is None or |
|
66 | elif cut_off_limit != -1 and (cut_off_limit is None or | |
67 | (filenode_old.size < cut_off_limit and filenode_new.size < cut_off_limit)): |
|
67 | (filenode_old.size < cut_off_limit and filenode_new.size < cut_off_limit)): | |
68 |
|
68 | |||
69 | f_gitdiff = get_gitdiff(filenode_old, filenode_new, |
|
69 | f_gitdiff = get_gitdiff(filenode_old, filenode_new, | |
70 | ignore_whitespace=ignore_whitespace, |
|
70 | ignore_whitespace=ignore_whitespace, | |
71 | context=line_context) |
|
71 | context=line_context) | |
72 | diff_processor = DiffProcessor(f_gitdiff, format='gitdiff') |
|
72 | diff_processor = DiffProcessor(f_gitdiff, format='gitdiff') | |
73 |
|
73 | |||
74 | diff = diff_processor.as_html(enable_comments=enable_comments) |
|
74 | diff = diff_processor.as_html(enable_comments=enable_comments) | |
75 | stats = diff_processor.stat() |
|
75 | stats = diff_processor.stat() | |
76 | size = len(diff or '') |
|
76 | size = len(diff or '') | |
77 | else: |
|
77 | else: | |
78 | diff = wrap_to_table(_('Changeset was too big and was cut off, use ' |
|
78 | diff = wrap_to_table(_('Changeset was too big and was cut off, use ' | |
79 | 'diff menu to display this diff')) |
|
79 | 'diff menu to display this diff')) | |
80 | stats = (0, 0) |
|
80 | stats = (0, 0) | |
81 | size = 0 |
|
81 | size = 0 | |
82 |
|
82 | |||
83 | if not diff: |
|
83 | if not diff: | |
84 | diff = wrap_to_table(_('No changes detected')) |
|
84 | diff = wrap_to_table(_('No changes detected')) | |
85 |
|
85 | |||
86 | cs1 = filenode_old. |
|
86 | cs1 = filenode_old.changeset.raw_id | |
87 | cs2 = filenode_new. |
|
87 | cs2 = filenode_new.changeset.raw_id | |
88 |
|
88 | |||
89 | return size, cs1, cs2, diff, stats |
|
89 | return size, cs1, cs2, diff, stats | |
90 |
|
90 | |||
91 |
|
91 | |||
92 | def get_gitdiff(filenode_old, filenode_new, ignore_whitespace=True, context=3): |
|
92 | def get_gitdiff(filenode_old, filenode_new, ignore_whitespace=True, context=3): | |
93 | """ |
|
93 | """ | |
94 | Returns git style diff between given ``filenode_old`` and ``filenode_new``. |
|
94 | Returns git style diff between given ``filenode_old`` and ``filenode_new``. | |
95 |
|
95 | |||
96 | :param ignore_whitespace: ignore whitespaces in diff |
|
96 | :param ignore_whitespace: ignore whitespaces in diff | |
97 | """ |
|
97 | """ | |
98 | # make sure we pass in default context |
|
98 | # make sure we pass in default context | |
99 | context = context or 3 |
|
99 | context = context or 3 | |
100 |
|
100 | |||
101 | for filenode in (filenode_old, filenode_new): |
|
101 | for filenode in (filenode_old, filenode_new): | |
102 | if not isinstance(filenode, FileNode): |
|
102 | if not isinstance(filenode, FileNode): | |
103 | raise VCSError("Given object should be FileNode object, not %s" |
|
103 | raise VCSError("Given object should be FileNode object, not %s" | |
104 | % filenode.__class__) |
|
104 | % filenode.__class__) | |
105 |
|
105 | |||
106 | repo = filenode_new.changeset.repository |
|
106 | repo = filenode_new.changeset.repository | |
107 | old_raw_id = getattr(filenode_old.changeset, 'raw_id', repo.EMPTY_CHANGESET) |
|
107 | old_raw_id = getattr(filenode_old.changeset, 'raw_id', repo.EMPTY_CHANGESET) | |
108 | new_raw_id = getattr(filenode_new.changeset, 'raw_id', repo.EMPTY_CHANGESET) |
|
108 | new_raw_id = getattr(filenode_new.changeset, 'raw_id', repo.EMPTY_CHANGESET) | |
109 |
|
109 | |||
110 | vcs_gitdiff = repo.get_diff(old_raw_id, new_raw_id, filenode_new.path, |
|
110 | vcs_gitdiff = repo.get_diff(old_raw_id, new_raw_id, filenode_new.path, | |
111 | ignore_whitespace, context) |
|
111 | ignore_whitespace, context) | |
112 |
|
112 | |||
113 | return vcs_gitdiff |
|
113 | return vcs_gitdiff | |
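`get_gitdiff` delegates the actual diffing to the backend's `repo.get_diff`. Purely for illustration (this is not the vcs backend call), a plain unified diff between two text versions can be produced with stdlib `difflib`, using `fromfile`/`tofile` to mimic the `a/` and `b/` path prefixes of a git-style diff:

```python
import difflib

old = 'line one\nline two\n'.splitlines(True)
new = 'line one\nline 2\n'.splitlines(True)

# fromfile/tofile stand in for the a/ and b/ prefixes seen in git diffs.
diff = ''.join(difflib.unified_diff(
    old, new, fromfile='a/example.txt', tofile='b/example.txt'))
```

The resulting text has the same `---`/`+++` headers and `@@` hunk markers that `DiffProcessor` below parses.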
114 |
|
114 | |||
115 |
|
115 | |||
116 | class DiffProcessor(object): |
|
116 | class DiffProcessor(object): | |
117 | """ |
|
117 | """ | |
118 | Give it a unified diff and it returns a list of the files that were |
|
118 | Give it a unified diff and it returns a list of the files that were | |
119 | mentioned in the diff together with a dict of meta information that |
|
119 | mentioned in the diff together with a dict of meta information that | |
120 | can be used to render it in a HTML template. |
|
120 | can be used to render it in a HTML template. | |
121 | """ |
|
121 | """ | |
122 | _chunk_re = re.compile(r'@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@(.*)') |
|
122 | _chunk_re = re.compile(r'@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@(.*)') | |
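The `_chunk_re` pattern above captures the old start line and length, the new start line and length, and any trailing context from a unified-diff hunk header. For example:

```python
import re

# Same hunk-header pattern as DiffProcessor._chunk_re above.
chunk_re = re.compile(r'@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@(.*)')

match = chunk_re.match('@@ -12,4 +12,6 @@ def diff(self):')
groups = match.groups()
```

The two length groups are optional (`(?:,(\d+))?`) because single-line hunks like `@@ -1 +1 @@` omit them; the parser below substitutes 1 in that case.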
123 |
|
123 | |||
124 | def __init__(self, diff, differ='diff', format='udiff'): |
|
124 | def __init__(self, diff, differ='diff', format='udiff'): | |
125 | """ |
|
125 | """ | |
126 | :param diff: a text in diff format or generator |
|
126 | :param diff: a text in diff format or generator | |
127 | :param format: format of diff passed, `udiff` or `gitdiff` |
|
127 | :param format: format of diff passed, `udiff` or `gitdiff` | |
128 | """ |
|
128 | """ | |
129 | if isinstance(diff, basestring): |
|
129 | if isinstance(diff, basestring): | |
130 | diff = [diff] |
|
130 | diff = [diff] | |
131 |
|
131 | |||
132 | self.__udiff = diff |
|
132 | self.__udiff = diff | |
133 | self.__format = format |
|
133 | self.__format = format | |
134 | self.adds = 0 |
|
134 | self.adds = 0 | |
135 | self.removes = 0 |
|
135 | self.removes = 0 | |
136 |
|
136 | |||
137 | if isinstance(self.__udiff, basestring): |
|
137 | if isinstance(self.__udiff, basestring): | |
138 | self.lines = iter(self.__udiff.splitlines(1)) |
|
138 | self.lines = iter(self.__udiff.splitlines(1)) | |
139 |
|
139 | |||
140 | elif self.__format == 'gitdiff': |
|
140 | elif self.__format == 'gitdiff': | |
141 | udiff_copy = self.copy_iterator() |
|
141 | udiff_copy = self.copy_iterator() | |
142 | self.lines = imap(self.escaper, self._parse_gitdiff(udiff_copy)) |
|
142 | self.lines = imap(self.escaper, self._parse_gitdiff(udiff_copy)) | |
143 | else: |
|
143 | else: | |
144 | udiff_copy = self.copy_iterator() |
|
144 | udiff_copy = self.copy_iterator() | |
145 | self.lines = imap(self.escaper, udiff_copy) |
|
145 | self.lines = imap(self.escaper, udiff_copy) | |
146 |
|
146 | |||
147 | # Select a differ. |
|
147 | # Select a differ. | |
148 | if differ == 'difflib': |
|
148 | if differ == 'difflib': | |
149 | self.differ = self._highlight_line_difflib |
|
149 | self.differ = self._highlight_line_difflib | |
150 | else: |
|
150 | else: | |
151 | self.differ = self._highlight_line_udiff |
|
151 | self.differ = self._highlight_line_udiff | |
152 |
|
152 | |||
153 | def escaper(self, string): |
|
153 | def escaper(self, string): | |
154 | return markupsafe.escape(string) |
|
154 | return markupsafe.escape(string) | |
155 |
|
155 | |||
156 | def copy_iterator(self): |
|
156 | def copy_iterator(self): | |
157 | """ |
|
157 | """ | |
158 | make a fresh copy of the generator; we should not iterate through |
|
158 | make a fresh copy of the generator; we should not iterate through | |
159 | the original as it's needed for repeated operations on |
|
159 | the original as it's needed for repeated operations on | |
160 | this instance of DiffProcessor |
|
160 | this instance of DiffProcessor | |
161 | """ |
|
161 | """ | |
162 | self.__udiff, iterator_copy = tee(self.__udiff) |
|
162 | self.__udiff, iterator_copy = tee(self.__udiff) | |
163 | return iterator_copy |
|
163 | return iterator_copy | |
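`copy_iterator` leans on `itertools.tee`: the stored generator is replaced by one twin while the other is handed out, so the one-shot diff source survives repeated traversal. In isolation:

```python
from itertools import tee

# A one-shot generator, like the diff iterator held by DiffProcessor.
source = (line for line in ['--- a/f\n', '+++ b/f\n', '@@ -1 +1 @@\n'])

# Replace the original with one twin, hand out the other.
source, copy = tee(source)
first_pass = list(copy)
second_pass = list(source)  # the kept twin is still fully intact
```

`tee` buffers items internally, so exhausting one twin does not starve the other; that is exactly why reassigning `self.__udiff` on every call keeps the instance reusable.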
164 |
|
164 | |||
165 | def _extract_rev(self, line1, line2): |
|
165 | def _extract_rev(self, line1, line2): | |
166 | """ |
|
166 | """ | |
167 | Extract the filename and revision hint from a line. |
|
167 | Extract the filename and revision hint from a line. | |
168 | """ |
|
168 | """ | |
169 |
|
169 | |||
170 | try: |
|
170 | try: | |
171 | if line1.startswith('--- ') and line2.startswith('+++ '): |
|
171 | if line1.startswith('--- ') and line2.startswith('+++ '): | |
172 | l1 = line1[4:].split(None, 1) |
|
172 | l1 = line1[4:].split(None, 1) | |
173 | old_filename = (l1[0].replace('a/', '', 1) |
|
173 | old_filename = (l1[0].replace('a/', '', 1) | |
174 | if len(l1) >= 1 else None) |
|
174 | if len(l1) >= 1 else None) | |
175 | old_rev = l1[1] if len(l1) == 2 else 'old' |
|
175 | old_rev = l1[1] if len(l1) == 2 else 'old' | |
176 |
|
176 | |||
177 | l2 = line2[4:].split(None, 1) |
|
177 | l2 = line2[4:].split(None, 1) | |
178 | new_filename = (l2[0].replace('b/', '', 1) |
|
178 | new_filename = (l2[0].replace('b/', '', 1) | |
179 | if len(l1) >= 1 else None) |
|
179 | if len(l1) >= 1 else None) | |
180 | new_rev = l2[1] if len(l2) == 2 else 'new' |
|
180 | new_rev = l2[1] if len(l2) == 2 else 'new' | |
181 |
|
181 | |||
182 | filename = (old_filename |
|
182 | filename = (old_filename | |
183 | if old_filename != '/dev/null' else new_filename) |
|
183 | if old_filename != '/dev/null' else new_filename) | |
184 |
|
184 | |||
185 | return filename, old_rev, new_rev |
|
185 | return filename, old_rev, new_rev | |
186 | except (ValueError, IndexError): |
|
186 | except (ValueError, IndexError): | |
187 | pass |
|
187 | pass | |
188 |
|
188 | |||
189 | return None, None, None |
|
189 | return None, None, None | |
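A standalone sketch of the header parsing done in `_extract_rev` above (same slicing, written here for modern Python and returning `(filename, old_rev, new_rev)`):

```python
def extract_rev(line1, line2):
    # Strip the '--- '/'+++ ' markers, drop the a/ and b/ prefixes,
    # and keep whatever follows the path as a revision hint.
    if line1.startswith('--- ') and line2.startswith('+++ '):
        l1 = line1[4:].split(None, 1)
        old_filename = l1[0].replace('a/', '', 1) if l1 else None
        old_rev = l1[1] if len(l1) == 2 else 'old'

        l2 = line2[4:].split(None, 1)
        new_filename = l2[0].replace('b/', '', 1) if l2 else None
        new_rev = l2[1] if len(l2) == 2 else 'new'

        # A newly created file shows /dev/null on the old side, so
        # prefer whichever side names a real path.
        filename = (old_filename
                    if old_filename != '/dev/null' else new_filename)
        return filename, old_rev, new_rev
    return None, None, None

modified = extract_rev('--- a/setup.py old-id', '+++ b/setup.py new-id')
created = extract_rev('--- /dev/null', '+++ b/new_file.py')
```

Anything that is not a well-formed `---`/`+++` pair falls through to `(None, None, None)` rather than raising.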
190 |
|
190 | |||
191 | def _parse_gitdiff(self, diffiterator): |
|
191 | def _parse_gitdiff(self, diffiterator): | |
192 | def line_decoder(l): |
|
192 | def line_decoder(l): | |
193 | if l.startswith('+') and not l.startswith('+++'): |
|
193 | if l.startswith('+') and not l.startswith('+++'): | |
194 | self.adds += 1 |
|
194 | self.adds += 1 | |
195 | elif l.startswith('-') and not l.startswith('---'): |
|
195 | elif l.startswith('-') and not l.startswith('---'): | |
196 | self.removes += 1 |
|
196 | self.removes += 1 | |
197 | return l.decode('utf8', 'replace') |
|
197 | return l.decode('utf8', 'replace') | |
198 |
|
198 | |||
199 | output = list(diffiterator) |
|
199 | output = list(diffiterator) | |
200 | size = len(output) |
|
200 | size = len(output) | |
201 |
|
201 | |||
202 | if size == 2: |
|
202 | if size == 2: | |
203 | l = [] |
|
203 | l = [] | |
204 | l.extend([output[0]]) |
|
204 | l.extend([output[0]]) | |
205 | l.extend(output[1].splitlines(1)) |
|
205 | l.extend(output[1].splitlines(1)) | |
206 | return map(line_decoder, l) |
|
206 | return map(line_decoder, l) | |
207 | elif size == 1: |
|
207 | elif size == 1: | |
208 | return map(line_decoder, output[0].splitlines(1)) |
|
208 | return map(line_decoder, output[0].splitlines(1)) | |
209 | elif size == 0: |
|
209 | elif size == 0: | |
210 | return [] |
|
210 | return [] | |
211 |
|
211 | |||
212 | raise Exception('wrong size of diff %s' % size) |
|
212 | raise Exception('wrong size of diff %s' % size) | |
213 |
|
213 | |||
214 | def _highlight_line_difflib(self, line, next_): |
|
214 | def _highlight_line_difflib(self, line, next_): | |
215 | """ |
|
215 | """ | |
216 | Highlight inline changes in both lines. |
|
216 | Highlight inline changes in both lines. | |
217 | """ |
|
217 | """ | |
218 |
|
218 | |||
219 | if line['action'] == 'del': |
|
219 | if line['action'] == 'del': | |
220 | old, new = line, next_ |
|
220 | old, new = line, next_ | |
221 | else: |
|
221 | else: | |
222 | old, new = next_, line |
|
222 | old, new = next_, line | |
223 |
|
223 | |||
224 | oldwords = re.split(r'(\W)', old['line']) |
|
224 | oldwords = re.split(r'(\W)', old['line']) | |
225 | newwords = re.split(r'(\W)', new['line']) |
|
225 | newwords = re.split(r'(\W)', new['line']) | |
226 |
|
226 | |||
227 | sequence = difflib.SequenceMatcher(None, oldwords, newwords) |
|
227 | sequence = difflib.SequenceMatcher(None, oldwords, newwords) | |
228 |
|
228 | |||
229 | oldfragments, newfragments = [], [] |
|
229 | oldfragments, newfragments = [], [] | |
230 | for tag, i1, i2, j1, j2 in sequence.get_opcodes(): |
|
230 | for tag, i1, i2, j1, j2 in sequence.get_opcodes(): | |
231 | oldfrag = ''.join(oldwords[i1:i2]) |
|
231 | oldfrag = ''.join(oldwords[i1:i2]) | |
232 | newfrag = ''.join(newwords[j1:j2]) |
|
232 | newfrag = ''.join(newwords[j1:j2]) | |
233 | if tag != 'equal': |
|
233 | if tag != 'equal': | |
234 | if oldfrag: |
|
234 | if oldfrag: | |
235 | oldfrag = '<del>%s</del>' % oldfrag |
|
235 | oldfrag = '<del>%s</del>' % oldfrag | |
236 | if newfrag: |
|
236 | if newfrag: | |
237 | newfrag = '<ins>%s</ins>' % newfrag |
|
237 | newfrag = '<ins>%s</ins>' % newfrag | |
238 | oldfragments.append(oldfrag) |
|
238 | oldfragments.append(oldfrag) | |
239 | newfragments.append(newfrag) |
|
239 | newfragments.append(newfrag) | |
240 |
|
240 | |||
241 | old['line'] = "".join(oldfragments) |
|
241 | old['line'] = "".join(oldfragments) | |
242 | new['line'] = "".join(newfragments) |
|
242 | new['line'] = "".join(newfragments) | |
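The difflib-based highlighter above splits both lines into words, then wraps every non-`equal` opcode span in `<del>`/`<ins>`. A condensed sketch of that core (modern Python, plain strings instead of the line dicts):

```python
import re
import difflib

# Split on non-word characters, keeping the separators so that
# joining the pieces reproduces the original line.
old_words = re.split(r'(\W)', 'return diff.raw_diff()')
new_words = re.split(r'(\W)', 'return diff.as_html()')

sm = difflib.SequenceMatcher(None, old_words, new_words)
old_out, new_out = [], []
for tag, i1, i2, j1, j2 in sm.get_opcodes():
    oldfrag = ''.join(old_words[i1:i2])
    newfrag = ''.join(new_words[j1:j2])
    if tag != 'equal':
        # Only the changed fragments get markup.
        if oldfrag:
            oldfrag = '<del>%s</del>' % oldfrag
        if newfrag:
            newfrag = '<ins>%s</ins>' % newfrag
    old_out.append(oldfrag)
    new_out.append(newfrag)

old_line = ''.join(old_out)
new_line = ''.join(new_out)
```

Because the opcodes partition both word lists completely, stripping the tags back out always recovers the original text.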
243 |
|
243 | |||
244 | def _highlight_line_udiff(self, line, next_): |
|
244 | def _highlight_line_udiff(self, line, next_): | |
245 | """ |
|
245 | """ | |
246 | Highlight inline changes in both lines. |
|
246 | Highlight inline changes in both lines. | |
247 | """ |
|
247 | """ | |
248 | start = 0 |
|
248 | start = 0 | |
249 | limit = min(len(line['line']), len(next_['line'])) |
|
249 | limit = min(len(line['line']), len(next_['line'])) | |
250 | while start < limit and line['line'][start] == next_['line'][start]: |
|
250 | while start < limit and line['line'][start] == next_['line'][start]: | |
251 | start += 1 |
|
251 | start += 1 | |
252 | end = -1 |
|
252 | end = -1 | |
253 | limit -= start |
|
253 | limit -= start | |
254 | while -end <= limit and line['line'][end] == next_['line'][end]: |
|
254 | while -end <= limit and line['line'][end] == next_['line'][end]: | |
255 | end -= 1 |
|
255 | end -= 1 | |
256 | end += 1 |
|
256 | end += 1 | |
257 | if start or end: |
|
257 | if start or end: | |
258 | def do(l): |
|
258 | def do(l): | |
259 | last = end + len(l['line']) |
|
259 | last = end + len(l['line']) | |
260 | if l['action'] == 'add': |
|
260 | if l['action'] == 'add': | |
261 | tag = 'ins' |
|
261 | tag = 'ins' | |
262 | else: |
|
262 | else: | |
263 | tag = 'del' |
|
263 | tag = 'del' | |
264 | l['line'] = '%s<%s>%s</%s>%s' % ( |
|
264 | l['line'] = '%s<%s>%s</%s>%s' % ( | |
265 | l['line'][:start], |
|
265 | l['line'][:start], | |
266 | tag, |
|
266 | tag, | |
267 | l['line'][start:last], |
|
267 | l['line'][start:last], | |
268 | tag, |
|
268 | tag, | |
269 | l['line'][last:] |
|
269 | l['line'][last:] | |
270 | ) |
|
270 | ) | |
271 | do(line) |
|
271 | do(line) | |
272 | do(next_) |
|
272 | do(next_) | |
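The udiff variant above trims the common prefix and suffix of the two lines so that only the differing middle is wrapped in `<del>`/`<ins>`. The same prefix/suffix scan as a standalone sketch on plain strings (modern Python):

```python
def highlight_pair(old_text, new_text):
    # Advance past the shared prefix...
    start = 0
    limit = min(len(old_text), len(new_text))
    while start < limit and old_text[start] == new_text[start]:
        start += 1
    # ...and back over the shared suffix.
    end = -1
    limit -= start
    while -end <= limit and old_text[end] == new_text[end]:
        end -= 1
    end += 1

    def wrap(text, tag):
        last = end + len(text)
        return '%s<%s>%s</%s>%s' % (text[:start], tag,
                                    text[start:last], tag, text[last:])
    return wrap(old_text, 'del'), wrap(new_text, 'ins')

old_html, new_html = highlight_pair('context = 3', 'context = 5')
```

This is cheaper than the difflib path (one forward scan, one backward scan) at the cost of highlighting a single contiguous span per line.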
273 |
|
273 | |||
274 | def _parse_udiff(self): |
|
274 | def _parse_udiff(self): | |
275 | """ |
|
275 | """ | |
276 | Parse the diff and return data for the template. |
|
276 | Parse the diff and return data for the template. | |
277 | """ |
|
277 | """ | |
278 | lineiter = self.lines |
|
278 | lineiter = self.lines | |
279 | files = [] |
|
279 | files = [] | |
280 | try: |
|
280 | try: | |
281 | line = lineiter.next() |
|
281 | line = lineiter.next() | |
282 | # skip first context |
|
282 | # skip first context | |
283 | skipfirst = True |
|
283 | skipfirst = True | |
284 | while 1: |
|
284 | while 1: | |
285 | # continue until we found the old file |
|
285 | # continue until we found the old file | |
286 | if not line.startswith('--- '): |
|
286 | if not line.startswith('--- '): | |
287 | line = lineiter.next() |
|
287 | line = lineiter.next() | |
288 | continue |
|
288 | continue | |
289 |
|
289 | |||
290 | chunks = [] |
|
290 | chunks = [] | |
291 | filename, old_rev, new_rev = \ |
|
291 | filename, old_rev, new_rev = \ | |
292 | self._extract_rev(line, lineiter.next()) |
|
292 | self._extract_rev(line, lineiter.next()) | |
293 | files.append({ |
|
293 | files.append({ | |
294 | 'filename': filename, |
|
294 | 'filename': filename, | |
295 | 'old_revision': old_rev, |
|
295 | 'old_revision': old_rev, | |
296 | 'new_revision': new_rev, |
|
296 | 'new_revision': new_rev, | |
297 | 'chunks': chunks |
|
297 | 'chunks': chunks | |
298 | }) |
|
298 | }) | |
299 |
|
299 | |||
300 | line = lineiter.next() |
|
300 | line = lineiter.next() | |
301 | while line: |
|
301 | while line: | |
302 | match = self._chunk_re.match(line) |
|
302 | match = self._chunk_re.match(line) | |
303 | if not match: |
|
303 | if not match: | |
304 | break |
|
304 | break | |
305 |
|
305 | |||
306 | lines = [] |
|
306 | lines = [] | |
307 | chunks.append(lines) |
|
307 | chunks.append(lines) | |
308 |
|
308 | |||
309 | old_line, old_end, new_line, new_end = \ |
|
309 | old_line, old_end, new_line, new_end = \ | |
310 | [int(x or 1) for x in match.groups()[:-1]] |
|
310 | [int(x or 1) for x in match.groups()[:-1]] | |
311 | old_line -= 1 |
|
311 | old_line -= 1 | |
312 | new_line -= 1 |
|
312 | new_line -= 1 | |
313 | context = len(match.groups()) == 5 |
|
313 | context = len(match.groups()) == 5 | |
314 | old_end += old_line |
|
314 | 314 |             old_end += old_line
315 | 315 |             new_end += new_line
316 | 316 |
317 | 317 |             if context:
318 | 318 |                 if not skipfirst:
319 | 319 |                     lines.append({
320 | 320 |                         'old_lineno': '...',
321 | 321 |                         'new_lineno': '...',
322 | 322 |                         'action': 'context',
323 | 323 |                         'line': line,
324 | 324 |                     })
325 | 325 |                 else:
326 | 326 |                     skipfirst = False
327 | 327 |
328 | 328 |             line = lineiter.next()
329 | 329 |             while old_line < old_end or new_line < new_end:
330 | 330 |                 if line:
331 | 331 |                     command, line = line[0], line[1:]
332 | 332 |                 else:
333 | 333 |                     command = ' '
334 | 334 |                 affects_old = affects_new = False
335 | 335 |
336 | 336 |                 # ignore those if we don't expect them
337 | 337 |                 if command in '#@':
338 | 338 |                     continue
339 | 339 |                 elif command == '+':
340 | 340 |                     affects_new = True
341 | 341 |                     action = 'add'
342 | 342 |                 elif command == '-':
343 | 343 |                     affects_old = True
344 | 344 |                     action = 'del'
345 | 345 |                 else:
346 | 346 |                     affects_old = affects_new = True
347 | 347 |                     action = 'unmod'
348 | 348 |
349 | 349 |                 old_line += affects_old
350 | 350 |                 new_line += affects_new
351 | 351 |                 lines.append({
352 | 352 |                     'old_lineno': affects_old and old_line or '',
353 | 353 |                     'new_lineno': affects_new and new_line or '',
354 | 354 |                     'action': action,
355 | 355 |                     'line': line
356 | 356 |                 })
357 | 357 |                 line = lineiter.next()
358 | 358 |
359 | 359 |         except StopIteration:
360 | 360 |             pass
361 | 361 |
362 | 362 |         # highlight inline changes
363 | 363 |         for _ in files:
364 | 364 |             for chunk in chunks:
365 | 365 |                 lineiter = iter(chunk)
366 | 366 |                 #first = True
367 | 367 |                 try:
368 | 368 |                     while 1:
369 | 369 |                         line = lineiter.next()
370 | 370 |                         if line['action'] != 'unmod':
371 | 371 |                             nextline = lineiter.next()
372 | 372 |                             if nextline['action'] == 'unmod' or \
373 | 373 |                                     nextline['action'] == line['action']:
374 | 374 |                                 continue
375 | 375 |                             self.differ(line, nextline)
376 | 376 |                 except StopIteration:
377 | 377 |                     pass
378 | 378 |
379 | 379 |         return files
380 | 380 |
381 | 381 |     def prepare(self):
382 | 382 |         """
383 | 383 |         Prepare the passed udiff for HTML rendering. It'l return a list
384 | 384 |         of dicts
385 | 385 |         """
386 | 386 |         return self._parse_udiff()
387 | 387 |
388 | 388 |     def _safe_id(self, idstring):
389 | 389 |         """Make a string safe for including in an id attribute.
390 | 390 |
391 | 391 |         The HTML spec says that id attributes 'must begin with
392 | 392 |         a letter ([A-Za-z]) and may be followed by any number
393 | 393 |         of letters, digits ([0-9]), hyphens ("-"), underscores
394 | 394 |         ("_"), colons (":"), and periods (".")'. These regexps
395 | 395 |         are slightly over-zealous, in that they remove colons
396 | 396 |         and periods unnecessarily.
397 | 397 |
398 | 398 |         Whitespace is transformed into underscores, and then
399 | 399 |         anything which is not a hyphen or a character that
400 | 400 |         matches \w (alphanumerics and underscore) is removed.
401 | 401 |
402 | 402 |         """
403 | 403 |         # Transform all whitespace to underscore
404 | 404 |         idstring = re.sub(r'\s', "_", '%s' % idstring)
405 | 405 |         # Remove everything that is not a hyphen or a member of \w
406 | 406 |         idstring = re.sub(r'(?!-)\W', "", idstring).lower()
407 | 407 |         return idstring
408 | 408 |
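The two substitutions in `_safe_id` above can be exercised on their own; the sketch below repeats them outside the class (the standalone function name and sample inputs are illustrative):

```python
import re

def safe_id(idstring):
    # Transform all whitespace to underscores, as the method above does
    idstring = re.sub(r'\s', "_", '%s' % idstring)
    # Drop everything that is neither a hyphen nor a \w character,
    # then lowercase; the lookahead keeps hyphens while \W removes
    # the rest of the punctuation
    return re.sub(r'(?!-)\W', "", idstring).lower()

print(safe_id("My File.txt"))    # my_filetxt
print(safe_id("vcs/diff-lib"))   # vcsdiff-lib
```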
409 | 409 |     def raw_diff(self):
410 | 410 |         """
411 | 411 |         Returns raw string as udiff
412 | 412 |         """
413 | 413 |         udiff_copy = self.copy_iterator()
414 | 414 |         if self.__format == 'gitdiff':
415 | 415 |             udiff_copy = self._parse_gitdiff(udiff_copy)
416 | 416 |         return u''.join(udiff_copy)
417 | 417 |
418 | 418 |     def as_html(self, table_class='code-difftable', line_class='line',
419 | 419 |                 new_lineno_class='lineno old', old_lineno_class='lineno new',
420 | 420 |                 code_class='code', enable_comments=False):
421 | 421 |         """
422 | 422 |         Return udiff as html table with customized css classes
423 | 423 |         """
424 | 424 |         def _link_to_if(condition, label, url):
425 | 425 |             """
426 | 426 |             Generates a link if condition is meet or just the label if not.
427 | 427 |             """
428 | 428 |
429 | 429 |             if condition:
430 | 430 |                 return '''<a href="%(url)s">%(label)s</a>''' % {
431 | 431 |                     'url': url,
432 | 432 |                     'label': label
433 | 433 |                 }
434 | 434 |             else:
435 | 435 |                 return label
436 | 436 |         diff_lines = self.prepare()
437 | 437 |         _html_empty = True
438 | 438 |         _html = []
439 | 439 |         _html.append('''<table class="%(table_class)s">\n''' % {
440 | 440 |             'table_class': table_class
441 | 441 |         })
442 | 442 |         for diff in diff_lines:
443 | 443 |             for line in diff['chunks']:
444 | 444 |                 _html_empty = False
445 | 445 |                 for change in line:
446 | 446 |                     _html.append('''<tr class="%(lc)s %(action)s">\n''' % {
447 | 447 |                         'lc': line_class,
448 | 448 |                         'action': change['action']
449 | 449 |                     })
450 | 450 |                     anchor_old_id = ''
451 | 451 |                     anchor_new_id = ''
452 | 452 |                     anchor_old = "%(filename)s_o%(oldline_no)s" % {
453 | 453 |                         'filename': self._safe_id(diff['filename']),
454 | 454 |                         'oldline_no': change['old_lineno']
455 | 455 |                     }
456 | 456 |                     anchor_new = "%(filename)s_n%(oldline_no)s" % {
457 | 457 |                         'filename': self._safe_id(diff['filename']),
458 | 458 |                         'oldline_no': change['new_lineno']
459 | 459 |                     }
460 | 460 |                     cond_old = (change['old_lineno'] != '...' and
461 | 461 |                                 change['old_lineno'])
462 | 462 |                     cond_new = (change['new_lineno'] != '...' and
463 | 463 |                                 change['new_lineno'])
464 | 464 |                     if cond_old:
465 | 465 |                         anchor_old_id = 'id="%s"' % anchor_old
466 | 466 |                     if cond_new:
467 | 467 |                         anchor_new_id = 'id="%s"' % anchor_new
468 | 468 |                     ###########################################################
469 | 469 |                     # OLD LINE NUMBER
470 | 470 |                     ###########################################################
471 | 471 |                     _html.append('''\t<td %(a_id)s class="%(olc)s">''' % {
472 | 472 |                         'a_id': anchor_old_id,
473 | 473 |                         'olc': old_lineno_class
474 | 474 |                     })
475 | 475 |
476 | 476 |                     _html.append('''%(link)s''' % {
477 | 477 |                         'link': _link_to_if(True, change['old_lineno'],
478 | 478 |                                             '#%s' % anchor_old)
479 | 479 |                     })
480 | 480 |                     _html.append('''</td>\n''')
481 | 481 |                     ###########################################################
482 | 482 |                     # NEW LINE NUMBER
483 | 483 |                     ###########################################################
484 | 484 |
485 | 485 |                     _html.append('''\t<td %(a_id)s class="%(nlc)s">''' % {
486 | 486 |                         'a_id': anchor_new_id,
487 | 487 |                         'nlc': new_lineno_class
488 | 488 |                     })
489 | 489 |
490 | 490 |                     _html.append('''%(link)s''' % {
491 | 491 |                         'link': _link_to_if(True, change['new_lineno'],
492 | 492 |                                             '#%s' % anchor_new)
493 | 493 |                     })
494 | 494 |                     _html.append('''</td>\n''')
495 | 495 |                     ###########################################################
496 | 496 |                     # CODE
497 | 497 |                     ###########################################################
498 | 498 |                     comments = '' if enable_comments else 'no-comment'
499 | 499 |                     _html.append('''\t<td class="%(cc)s %(inc)s">''' % {
500 | 500 |                         'cc': code_class,
501 | 501 |                         'inc': comments
502 | 502 |                     })
503 | 503 |                     _html.append('''\n\t\t<pre>%(code)s</pre>\n''' % {
504 | 504 |                         'code': change['line']
505 | 505 |                     })
506 | 506 |                     _html.append('''\t</td>''')
507 | 507 |                     _html.append('''\n</tr>\n''')
508 | 508 |         _html.append('''</table>''')
509 | 509 |         if _html_empty:
510 | 510 |             return None
511 | 511 |         return ''.join(_html)
512 | 512 |
513 | 513 |     def stat(self):
514 | 514 |         """
515 | 515 |         Returns tuple of added, and removed lines for this instance
516 | 516 |         """
517 | 517 |         return self.adds, self.removes
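`stat()` returns the totals accumulated while parsing; the first-character rule from the chunk loop above (`+` maps to add, `-` to del, anything else is unmodified) can be sketched standalone (function names here are illustrative, not the class's own):

```python
def classify(line):
    # The first character of a udiff body line picks the action,
    # mirroring the command dispatch in the chunk loop above
    command = line[0] if line else ' '
    if command == '+':
        return 'add'
    if command == '-':
        return 'del'
    return 'unmod'

def count_changes(udiff_lines):
    # Return (added, removed), the same shape stat() reports
    actions = [classify(l) for l in udiff_lines]
    return actions.count('add'), actions.count('del')

print(count_changes(['+new', '-old', ' kept', '+more']))  # (2, 1)
```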
@@ -1,247 +1,250 @@
  1 |   1 | # -*- coding: utf-8 -*-
  2 |   2 | """
  3 |   3 |     rhodecode.lib.middleware.simplegit
  4 |   4 |     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  5 |   5 |
  6 |   6 |     SimpleGit middleware for handling git protocol request (push/clone etc.)
  7 |   7 |     It's implemented with basic auth function
  8 |   8 |
  9 |   9 |     :created_on: Apr 28, 2010
 10 |  10 |     :author: marcink
 11 |  11 |     :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com>
 12 |  12 |     :license: GPLv3, see COPYING for more details.
 13 |  13 | """
 14 |  14 | # This program is free software: you can redistribute it and/or modify
 15 |  15 | # it under the terms of the GNU General Public License as published by
 16 |  16 | # the Free Software Foundation, either version 3 of the License, or
 17 |  17 | # (at your option) any later version.
 18 |  18 | #
 19 |  19 | # This program is distributed in the hope that it will be useful,
 20 |  20 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
 21 |  21 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 22 |  22 | # GNU General Public License for more details.
 23 |  23 | #
 24 |  24 | # You should have received a copy of the GNU General Public License
 25 |  25 | # along with this program. If not, see <http://www.gnu.org/licenses/>.
 26 |  26 |
 27 |  27 | import os
 28 |  28 | import re
 29 |  29 | import logging
 30 |  30 | import traceback
 31 |  31 |
 32 |  32 | from dulwich import server as dulserver
 33 |  33 |
 34 |  34 |
 35 |  35 | class SimpleGitUploadPackHandler(dulserver.UploadPackHandler):
 36 |  36 |
 37 |  37 |     def handle(self):
 38 |  38 |         write = lambda x: self.proto.write_sideband(1, x)
 39 |  39 |
 40 |  40 |         graph_walker = dulserver.ProtocolGraphWalker(self,
 41 |  41 |                                                      self.repo.object_store,
 42 |  42 |                                                      self.repo.get_peeled)
 43 |  43 |         objects_iter = self.repo.fetch_objects(
 44 |  44 |             graph_walker.determine_wants, graph_walker, self.progress,
 45 |  45 |             get_tagged=self.get_tagged)
 46 |  46 |
 47 |  47 |         # Do they want any objects?
 48 |  48 |         if objects_iter is None or len(objects_iter) == 0:
 49 |  49 |             return
 50 |  50 |
 51 |  51 |         self.progress("counting objects: %d, done.\n" % len(objects_iter))
 52 |  52 |         dulserver.write_pack_objects(dulserver.ProtocolFile(None, write),
 53 |  53 |                                      objects_iter, len(objects_iter))
 54 |  54 |         messages = []
 55 |  55 |         messages.append('thank you for using rhodecode')
 56 |  56 |
 57 |  57 |         for msg in messages:
 58 |  58 |             self.progress(msg + "\n")
 59 |  59 |         # we are done
 60 |  60 |         self.proto.write("0000")
 61 |  61 |
 62 |  62 | dulserver.DEFAULT_HANDLERS = {
 63 |  63 |     'git-upload-pack': SimpleGitUploadPackHandler,
 64 |  64 |     'git-receive-pack': dulserver.ReceivePackHandler,
 65 |  65 | }
 66 |  66 |
 67 |  67 | from dulwich.repo import Repo
 68 |  68 | from dulwich.web import HTTPGitApplication
 69 |  69 |
 70 |  70 | from paste.httpheaders import REMOTE_USER, AUTH_TYPE
 71 |  71 |
 72 |  72 | from rhodecode.lib import safe_str
 73 |  73 | from rhodecode.lib.base import BaseVCSController
 74 |  74 | from rhodecode.lib.auth import get_container_username
 75 |  75 | from rhodecode.lib.utils import is_valid_repo
 76 |  76 | from rhodecode.model.db import User
 77 |  77 |
 78 |  78 | from webob.exc import HTTPNotFound, HTTPForbidden, HTTPInternalServerError
 79 |  79 |
 80 |  80 | log = logging.getLogger(__name__)
 81 |  81 |
 82 |  82 |
 83 |  83 | GIT_PROTO_PAT = re.compile(r'^/(.+)/(info/refs|git-upload-pack|git-receive-pack)')
 84 |  84 |
 85 |  85 |
 86 |  86 | def is_git(environ):
 87 |  87 |     path_info = environ['PATH_INFO']
 88 |  88 |     isgit_path = GIT_PROTO_PAT.match(path_info)
 89 |  89 |     log.debug('is a git path %s pathinfo : %s' % (isgit_path, path_info))
 90 |  90 |     return isgit_path
 91 |  91 |
 92 |  92 |
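`is_git` above simply matches `PATH_INFO` against `GIT_PROTO_PAT`; a quick probe of that regex on typical smart-HTTP paths (the sample repository paths are made up):

```python
import re

# Same pattern as in the middleware above
GIT_PROTO_PAT = re.compile(
    r'^/(.+)/(info/refs|git-upload-pack|git-receive-pack)')

for path in ('/myrepo/info/refs',
             '/group/myrepo/git-upload-pack',
             '/myrepo/changelog'):
    m = GIT_PROTO_PAT.match(path)
    # group(1) is the repository name, which may itself contain slashes
    print(path, '->', m.group(1) if m else 'not a git path')
```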
 93 |  93 | class SimpleGit(BaseVCSController):
 94 |  94 |
 95 |  95 |     def _handle_request(self, environ, start_response):
 96 |  96 |
 97 |  97 |         if not is_git(environ):
 98 |  98 |             return self.application(environ, start_response)
 99 |  99 |
100 | 100 |         proxy_key = 'HTTP_X_REAL_IP'
101 | 101 |         def_key = 'REMOTE_ADDR'
102 | 102 |         ipaddr = environ.get(proxy_key, environ.get(def_key, '0.0.0.0'))
103 | 103 |         username = None
104 | 104 |         # skip passing error to error controller
105 | 105 |         environ['pylons.status_code_redirect'] = True
106 | 106 |
107 | 107 |         #======================================================================
108 | 108 |         # EXTRACT REPOSITORY NAME FROM ENV
109 | 109 |         #======================================================================
110 | 110 |         try:
111 | 111 |             repo_name = self.__get_repository(environ)
112 | 112 |             log.debug('Extracted repo name is %s' % repo_name)
113 | 113 |         except:
114 | 114 |             return HTTPInternalServerError()(environ, start_response)
115 | 115 |
116 | 116 |         #======================================================================
117 | 117 |         # GET ACTION PULL or PUSH
118 | 118 |         #======================================================================
119 | 119 |         action = self.__get_action(environ)
120 | 120 |
121 | 121 |         #======================================================================
122 | 122 |         # CHECK ANONYMOUS PERMISSION
123 | 123 |         #======================================================================
    | 124 |
124 | 125 |         if action in ['pull', 'push']:
125 | 126 |             anonymous_user = self.__get_user('default')
126 | 127 |             username = anonymous_user.username
127 | 128 |             anonymous_perm = self._check_permission(action, anonymous_user,
128 | 129 |                                                     repo_name)
129 | 130 |
130 | 131 |             if anonymous_perm is not True or anonymous_user.active is False:
131 | 132 |                 if anonymous_perm is not True:
132 | 133 |                     log.debug('Not enough credentials to access this '
133 | 134 |                               'repository as anonymous user')
134 | 135 |                 if anonymous_user.active is False:
135 | 136 |                     log.debug('Anonymous access is disabled, running '
136 | 137 |                               'authentication')
137 | 138 |                 #==============================================================
138 | 139 |                 # DEFAULT PERM FAILED OR ANONYMOUS ACCESS IS DISABLED SO WE
139 | 140 |                 # NEED TO AUTHENTICATE AND ASK FOR AUTH USER PERMISSIONS
140 | 141 |                 #==============================================================
141 | 142 |
142 | 143 |                 # Attempting to retrieve username from the container
143 | 144 |                 username = get_container_username(environ, self.config)
144 | 145 |
145 | 146 |                 # If not authenticated by the container, running basic auth
146 | 147 |                 if not username:
147 | 148 |                     self.authenticate.realm = \
148 | 149 |                         safe_str(self.config['rhodecode_realm'])
149 | 150 |                     result = self.authenticate(environ)
150 | 151 |                     if isinstance(result, str):
151 | 152 |                         AUTH_TYPE.update(environ, 'basic')
152 | 153 |                         REMOTE_USER.update(environ, result)
153 | 154 |                         username = result
154 | 155 |                     else:
155 | 156 |                         return result.wsgi_application(environ, start_response)
156 | 157 |
157 | 158 |                 #==============================================================
158 | 159 |                 # CHECK PERMISSIONS FOR THIS REQUEST USING GIVEN USERNAME
159 | 160 |                 #==============================================================
160 | 161 |                 if action in ['pull', 'push']:
161 | 162 |                     try:
162 | 163 |                         user = self.__get_user(username)
163 | 164 |                         if user is None or not user.active:
164 | 165 |                             return HTTPForbidden()(environ, start_response)
165 | 166 |                         username = user.username
166 | 167 |                     except:
167 | 168 |                         log.error(traceback.format_exc())
168 | 169 |                         return HTTPInternalServerError()(environ,
169 | 170 |                                                          start_response)
170 | 171 |
171 | 172 |                     #check permissions for this repository
172 |     |                     perm = self._check_permission(action, user,
    | 173 |                     perm = self._check_permission(action, user, repo_name)
173 |     |                                                   repo_name)
174 | 174 |                     if perm is not True:
175 | 175 |                         return HTTPForbidden()(environ, start_response)
176 | 176 |
177 | 177 |         #===================================================================
178 | 178 |         # GIT REQUEST HANDLING
179 | 179 |         #===================================================================
180 |     |
181 | 180 |         repo_path = safe_str(os.path.join(self.basepath, repo_name))
182 | 181 |         log.debug('Repository path is %s' % repo_path)
183 | 182 |
184 | 183 |         # quick check if that dir exists...
185 | 184 |         if is_valid_repo(repo_name, self.basepath) is False:
186 | 185 |             return HTTPNotFound()(environ, start_response)
187 | 186 |
188 | 187 |         try:
189 | 188 |             #invalidate cache on push
190 | 189 |             if action == 'push':
191 | 190 |                 self._invalidate_cache(repo_name)
192 | 191 |             log.info('%s action on GIT repo "%s"' % (action, repo_name))
193 | 192 |             app = self.__make_app(repo_name, repo_path)
194 | 193 |             return app(environ, start_response)
195 | 194 |         except Exception:
196 | 195 |             log.error(traceback.format_exc())
197 | 196 |             return HTTPInternalServerError()(environ, start_response)
198 | 197 |
199 | 198 |     def __make_app(self, repo_name, repo_path):
200 | 199 |         """
201 | 200 |         Make an wsgi application using dulserver
202 | 201 |
203 | 202 |         :param repo_name: name of the repository
204 | 203 |         :param repo_path: full path to the repository
205 | 204 |         """
206 |     |
207 | 205 |         _d = {'/' + repo_name: Repo(repo_path)}
208 | 206 |         backend = dulserver.DictBackend(_d)
209 | 207 |         gitserve = HTTPGitApplication(backend)
210 | 208 |
211 | 209 |         return gitserve
212 | 210 |
213 | 211 |     def __get_repository(self, environ):
214 | 212 |         """
215 | 213 |         Get's repository name out of PATH_INFO header
216 | 214 |
217 | 215 |         :param environ: environ where PATH_INFO is stored
218 | 216 |         """
219 | 217 |         try:
220 | 218 |             environ['PATH_INFO'] = self._get_by_id(environ['PATH_INFO'])
221 | 219 |             repo_name = GIT_PROTO_PAT.match(environ['PATH_INFO']).group(1)
222 | 220 |         except:
223 | 221 |             log.error(traceback.format_exc())
224 | 222 |             raise
225 | 223 |
226 | 224 |         return repo_name
227 | 225 |
228 | 226 |     def __get_user(self, username):
229 | 227 |         return User.get_by_username(username)
230 | 228 |
231 | 229 |     def __get_action(self, environ):
232 |     |         """Maps git request commands into a pull or push command.
    | 230 |         """
    | 231 |         Maps git request commands into a pull or push command.
233 | 232 |
234 | 233 |         :param environ:
235 | 234 |         """
236 | 235 |         service = environ['QUERY_STRING'].split('=')
    | 236 |
237 | 237 |         if len(service) > 1:
238 | 238 |             service_cmd = service[1]
239 | 239 |             mapping = {
240 | 240 |                 'git-receive-pack': 'push',
241 | 241 |                 'git-upload-pack': 'pull',
242 | 242 |             }
243 |     |
    | 243 |             op = mapping[service_cmd]
244 |     |             return mapping.get(service_cmd,
    | 244 |             self._git_stored_op = op
245 |     |                                service_cmd if service_cmd else 'other')
    | 245 |             return op
246 | 246 |         else:
247 |     |             return 'other'
    | 247 |             # try to fallback to stored variable as we don't know if the last
    | 248 |             # operation is pull/push
    | 249 |             op = getattr(self, '_git_stored_op', 'pull')
    | 250 |             return op
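The rewritten `__get_action` now remembers the last mapped operation, so a follow-up request that carries no `?service=` parameter (as the second request of a smart-HTTP exchange typically does) reuses it instead of falling through to 'other'. A standalone sketch of that stored-op fallback (the class and attribute names here are illustrative, not RhodeCode's own):

```python
class ActionResolver:
    # Same service-to-action table as in __get_action above
    MAPPING = {
        'git-receive-pack': 'push',
        'git-upload-pack': 'pull',
    }

    def get_action(self, query_string):
        service = query_string.split('=')
        if len(service) > 1:
            op = self.MAPPING[service[1]]
            self._stored_op = op  # remember for the next request
            return op
        # No service parameter: fall back to the stored operation,
        # defaulting to 'pull' when nothing has been seen yet
        return getattr(self, '_stored_op', 'pull')

r = ActionResolver()
print(r.get_action('service=git-receive-pack'))  # push
print(r.get_action(''))                          # push (from stored op)
print(ActionResolver().get_action(''))           # pull (default)
```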
@@ -1,451 +1,451 @@
  1 |   1 | import re
  2 |   2 | from itertools import chain
  3 |   3 | from dulwich import objects
  4 |   4 | from subprocess import Popen, PIPE
  5 |   5 | from rhodecode.lib.vcs.conf import settings
  6 |   6 | from rhodecode.lib.vcs.exceptions import RepositoryError
  7 |   7 | from rhodecode.lib.vcs.exceptions import ChangesetError
  8 |   8 | from rhodecode.lib.vcs.exceptions import NodeDoesNotExistError
  9 |   9 | from rhodecode.lib.vcs.exceptions import VCSError
 10 |  10 | from rhodecode.lib.vcs.exceptions import ChangesetDoesNotExistError
 11 |  11 | from rhodecode.lib.vcs.exceptions import ImproperArchiveTypeError
 12 |  12 | from rhodecode.lib.vcs.backends.base import BaseChangeset
 13 |  13 | from rhodecode.lib.vcs.nodes import FileNode, DirNode, NodeKind, RootNode, RemovedFileNode
 14 |  14 | from rhodecode.lib.vcs.utils import safe_unicode
 15 |  15 | from rhodecode.lib.vcs.utils import date_fromtimestamp
 16 |  16 | from rhodecode.lib.vcs.utils.lazy import LazyProperty
 17 |  17 |
 18 |  18 |
 19 |  19 | class GitChangeset(BaseChangeset):
 20 |  20 |     """
 21 |  21 |     Represents state of the repository at single revision.
 22 |  22 |     """
 23 |  23 |
 24 |  24 |     def __init__(self, repository, revision):
 25 |  25 |         self._stat_modes = {}
26 | self.repository = repository |
|
26 | self.repository = repository | |
27 | self.raw_id = revision |
|
27 | self.raw_id = revision | |
28 | self.revision = repository.revisions.index(revision) |
|
28 | self.revision = repository.revisions.index(revision) | |
29 |
|
29 | |||
30 | self.short_id = self.raw_id[:12] |
|
30 | self.short_id = self.raw_id[:12] | |
31 | self.id = self.raw_id |
|
31 | self.id = self.raw_id | |
32 | try: |
|
32 | try: | |
33 | commit = self.repository._repo.get_object(self.raw_id) |
|
33 | commit = self.repository._repo.get_object(self.raw_id) | |
34 | except KeyError: |
|
34 | except KeyError: | |
35 | raise RepositoryError("Cannot get object with id %s" % self.raw_id) |
|
35 | raise RepositoryError("Cannot get object with id %s" % self.raw_id) | |
36 | self._commit = commit |
|
36 | self._commit = commit | |
37 | self._tree_id = commit.tree |
|
37 | self._tree_id = commit.tree | |
38 |
|
38 | |||
39 | try: |
|
39 | try: | |
40 | self.message = safe_unicode(commit.message[:-1]) |
|
40 | self.message = safe_unicode(commit.message[:-1]) | |
41 | # Always strip last eol |
|
41 | # Always strip last eol | |
42 | except UnicodeDecodeError: |
|
42 | except UnicodeDecodeError: | |
43 | self.message = commit.message[:-1].decode(commit.encoding |
|
43 | self.message = commit.message[:-1].decode(commit.encoding | |
44 | or 'utf-8') |
|
44 | or 'utf-8') | |
45 | #self.branch = None |
|
45 | #self.branch = None | |
46 | self.tags = [] |
|
46 | self.tags = [] | |
47 | #tree = self.repository.get_object(self._tree_id) |
|
47 | #tree = self.repository.get_object(self._tree_id) | |
48 | self.nodes = {} |
|
48 | self.nodes = {} | |
49 | self._paths = {} |
|
49 | self._paths = {} | |
50 |
|
50 | |||
51 | @LazyProperty |
|
51 | @LazyProperty | |
52 | def author(self): |
|
52 | def author(self): | |
53 | return safe_unicode(self._commit.committer) |
|
53 | return safe_unicode(self._commit.committer) | |
54 |
|
54 | |||
55 | @LazyProperty |
|
55 | @LazyProperty | |
56 | def date(self): |
|
56 | def date(self): | |
57 | return date_fromtimestamp(self._commit.commit_time, |
|
57 | return date_fromtimestamp(self._commit.commit_time, | |
58 | self._commit.commit_timezone) |
|
58 | self._commit.commit_timezone) | |
59 |
|
59 | |||
60 | @LazyProperty |
|
60 | @LazyProperty | |
61 | def status(self): |
|
61 | def status(self): | |
62 | """ |
|
62 | """ | |
63 | Returns modified, added, removed, deleted files for current changeset |
|
63 | Returns modified, added, removed, deleted files for current changeset | |
64 | """ |
|
64 | """ | |
65 | return self.changed, self.added, self.removed |
|
65 | return self.changed, self.added, self.removed | |
66 |
|
66 | |||
67 | @LazyProperty |
|
67 | @LazyProperty | |
68 | def branch(self): |
|
68 | def branch(self): | |
69 | # TODO: Cache as we walk (id <-> branch name mapping) |
|
69 | # TODO: Cache as we walk (id <-> branch name mapping) | |
70 | refs = self.repository._repo.get_refs() |
|
70 | refs = self.repository._repo.get_refs() | |
71 | heads = [(key[len('refs/heads/'):], val) for key, val in refs.items() |
|
71 | heads = [(key[len('refs/heads/'):], val) for key, val in refs.items() | |
72 | if key.startswith('refs/heads/')] |
|
72 | if key.startswith('refs/heads/')] | |
73 |
|
73 | |||
74 | for name, id in heads: |
|
74 | for name, id in heads: | |
75 | walker = self.repository._repo.object_store.get_graph_walker([id]) |
|
75 | walker = self.repository._repo.object_store.get_graph_walker([id]) | |
76 | while True: |
|
76 | while True: | |
77 | id = walker.next() |
|
77 | id = walker.next() | |
78 | if not id: |
|
78 | if not id: | |
79 | break |
|
79 | break | |
80 | if id == self.id: |
|
80 | if id == self.id: | |
81 | return safe_unicode(name) |
|
81 | return safe_unicode(name) | |
82 | raise ChangesetError("This should not happen... Have you manually " |
|
82 | raise ChangesetError("This should not happen... Have you manually " | |
83 | "changed id of the changeset?") |
|
83 | "changed id of the changeset?") | |
84 |
|
84 | |||
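The `branch` property above decides which branch a changeset belongs to by walking each head's ancestry until the commit turns up. A minimal sketch of the same search, using a plain `{id: [parent ids]}` map as a hypothetical stand-in for dulwich's graph walker:

```python
# Sketch: find which branch head a commit is reachable from by
# walking each head's ancestry (parents is a hypothetical
# {commit_id: [parent ids]} map standing in for the object store).
def branch_of(commit_id, heads, parents):
    for name, head in heads.items():
        todo, seen = [head], set()
        while todo:
            cur = todo.pop()
            if cur == commit_id:
                return name
            if cur in seen:
                continue
            seen.add(cur)
            todo.extend(parents.get(cur, []))
    return None  # unreachable from any head
```

As the TODO in the diff notes, caching the id-to-branch mapping while walking would avoid repeating this traversal per changeset.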
85 | def _fix_path(self, path): |
|
85 | def _fix_path(self, path): | |
86 | """ |
|
86 | """ | |
87 | Paths are stored without a trailing slash, so we need to get rid of it if |
|
87 | Paths are stored without a trailing slash, so we need to get rid of it if | |
88 | needed. |
|
88 | needed. | |
89 | """ |
|
89 | """ | |
90 | if path.endswith('/'): |
|
90 | if path.endswith('/'): | |
91 | path = path.rstrip('/') |
|
91 | path = path.rstrip('/') | |
92 | return path |
|
92 | return path | |
93 |
|
93 | |||
94 | def _get_id_for_path(self, path): |
|
94 | def _get_id_for_path(self, path): | |
95 | # FIXME: Please, spare a couple of minutes and make those codes cleaner; |
|
95 | # FIXME: Please, spare a couple of minutes and make those codes cleaner; | |
96 | if not path in self._paths: |
|
96 | if not path in self._paths: | |
97 | path = path.strip('/') |
|
97 | path = path.strip('/') | |
98 | # set root tree |
|
98 | # set root tree | |
99 | tree = self.repository._repo[self._commit.tree] |
|
99 | tree = self.repository._repo[self._commit.tree] | |
100 | if path == '': |
|
100 | if path == '': | |
101 | self._paths[''] = tree.id |
|
101 | self._paths[''] = tree.id | |
102 | return tree.id |
|
102 | return tree.id | |
103 | splitted = path.split('/') |
|
103 | splitted = path.split('/') | |
104 | dirs, name = splitted[:-1], splitted[-1] |
|
104 | dirs, name = splitted[:-1], splitted[-1] | |
105 | curdir = '' |
|
105 | curdir = '' | |
106 | for dir in dirs: |
|
106 | for dir in dirs: | |
107 | if curdir: |
|
107 | if curdir: | |
108 | curdir = '/'.join((curdir, dir)) |
|
108 | curdir = '/'.join((curdir, dir)) | |
109 | else: |
|
109 | else: | |
110 | curdir = dir |
|
110 | curdir = dir | |
111 | #if curdir in self._paths: |
|
111 | #if curdir in self._paths: | |
112 | ## This path have been already traversed |
|
112 | ## This path have been already traversed | |
113 | ## Update tree and continue |
|
113 | ## Update tree and continue | |
114 | #tree = self.repository._repo[self._paths[curdir]] |
|
114 | #tree = self.repository._repo[self._paths[curdir]] | |
115 | #continue |
|
115 | #continue | |
116 | dir_id = None |
|
116 | dir_id = None | |
117 | for item, stat, id in tree.iteritems(): |
|
117 | for item, stat, id in tree.iteritems(): | |
118 | if curdir: |
|
118 | if curdir: | |
119 | item_path = '/'.join((curdir, item)) |
|
119 | item_path = '/'.join((curdir, item)) | |
120 | else: |
|
120 | else: | |
121 | item_path = item |
|
121 | item_path = item | |
122 | self._paths[item_path] = id |
|
122 | self._paths[item_path] = id | |
123 | self._stat_modes[item_path] = stat |
|
123 | self._stat_modes[item_path] = stat | |
124 | if dir == item: |
|
124 | if dir == item: | |
125 | dir_id = id |
|
125 | dir_id = id | |
126 | if dir_id: |
|
126 | if dir_id: | |
127 | # Update tree |
|
127 | # Update tree | |
128 | tree = self.repository._repo[dir_id] |
|
128 | tree = self.repository._repo[dir_id] | |
129 | if not isinstance(tree, objects.Tree): |
|
129 | if not isinstance(tree, objects.Tree): | |
130 | raise ChangesetError('%s is not a directory' % curdir) |
|
130 | raise ChangesetError('%s is not a directory' % curdir) | |
131 | else: |
|
131 | else: | |
132 | raise ChangesetError('%s has not been found' % curdir) |
|
132 | raise ChangesetError('%s has not been found' % curdir) | |
133 | for item, stat, id in tree.iteritems(): |
|
133 | for item, stat, id in tree.iteritems(): | |
134 | if curdir: |
|
134 | if curdir: | |
135 | name = '/'.join((curdir, item)) |
|
135 | name = '/'.join((curdir, item)) | |
136 | else: |
|
136 | else: | |
137 | name = item |
|
137 | name = item | |
138 | self._paths[name] = id |
|
138 | self._paths[name] = id | |
139 | self._stat_modes[name] = stat |
|
139 | self._stat_modes[name] = stat | |
140 | if not path in self._paths: |
|
140 | if not path in self._paths: | |
141 | raise NodeDoesNotExistError("There is no file nor directory " |
|
141 | raise NodeDoesNotExistError("There is no file nor directory " | |
142 | "at the given path %r at revision %r" |
|
142 | "at the given path %r at revision %r" | |
143 | % (path, self.short_id)) |
|
143 | % (path, self.short_id)) | |
144 | return self._paths[path] |
|
144 | return self._paths[path] | |
145 |
|
145 | |||
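`_get_id_for_path` above resolves a path one directory level at a time, caching the id and stat mode of every sibling entry it scans along the way. The traversal pattern can be sketched with nested dicts as a hypothetical stand-in for dulwich `Tree` objects:

```python
# Sketch of the incremental path lookup in _get_id_for_path: descend
# one level at a time, caching every entry seen while scanning each
# level so later lookups of siblings are free. Trees are modelled as
# plain dicts here (hypothetical stand-in for dulwich Tree objects).
def resolve_path(tree, path, cache=None):
    if cache is None:
        cache = {}
    path = path.strip('/')
    if path in cache:
        return cache[path]
    node, curdir = tree, ''
    for part in (path.split('/') if path else []):
        if not isinstance(node, dict):
            raise KeyError('%s is not a directory' % curdir)
        # cache all siblings while we scan this level
        for name, child in node.items():
            item_path = '/'.join((curdir, name)) if curdir else name
            cache[item_path] = child
        if part not in node:
            raise KeyError('%s has not been found' % part)
        node = node[part]
        curdir = '/'.join((curdir, part)) if curdir else part
    return node
```

The side effect of filling the cache during descent is what lets `get_file_mode` simply call `_get_id_for_path` first and then read `self._stat_modes[path]`.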
146 | def _get_kind(self, path): |
|
146 | def _get_kind(self, path): | |
147 | id = self._get_id_for_path(path) |
|
147 | id = self._get_id_for_path(path) | |
148 | obj = self.repository._repo[id] |
|
148 | obj = self.repository._repo[id] | |
149 | if isinstance(obj, objects.Blob): |
|
149 | if isinstance(obj, objects.Blob): | |
150 | return NodeKind.FILE |
|
150 | return NodeKind.FILE | |
151 | elif isinstance(obj, objects.Tree): |
|
151 | elif isinstance(obj, objects.Tree): | |
152 | return NodeKind.DIR |
|
152 | return NodeKind.DIR | |
153 |
|
153 | |||
154 | def _get_file_nodes(self): |
|
154 | def _get_file_nodes(self): | |
155 | return chain(*(t[2] for t in self.walk())) |
|
155 | return chain(*(t[2] for t in self.walk())) | |
156 |
|
156 | |||
157 | @LazyProperty |
|
157 | @LazyProperty | |
158 | def parents(self): |
|
158 | def parents(self): | |
159 | """ |
|
159 | """ | |
160 | Returns a list of parent changesets. |
|
160 | Returns a list of parent changesets. | |
161 | """ |
|
161 | """ | |
162 | return [self.repository.get_changeset(parent) |
|
162 | return [self.repository.get_changeset(parent) | |
163 | for parent in self._commit.parents] |
|
163 | for parent in self._commit.parents] | |
164 |
|
164 | |||
165 | def next(self, branch=None): |
|
165 | def next(self, branch=None): | |
166 |
|
166 | |||
167 | if branch and self.branch != branch: |
|
167 | if branch and self.branch != branch: | |
168 | raise VCSError('Branch option used on changeset not belonging ' |
|
168 | raise VCSError('Branch option used on changeset not belonging ' | |
169 | 'to that branch') |
|
169 | 'to that branch') | |
170 |
|
170 | |||
171 | def _next(changeset, branch): |
|
171 | def _next(changeset, branch): | |
172 | try: |
|
172 | try: | |
173 | next_ = changeset.revision + 1 |
|
173 | next_ = changeset.revision + 1 | |
174 | next_rev = changeset.repository.revisions[next_] |
|
174 | next_rev = changeset.repository.revisions[next_] | |
175 | except IndexError: |
|
175 | except IndexError: | |
176 | raise ChangesetDoesNotExistError |
|
176 | raise ChangesetDoesNotExistError | |
177 | cs = changeset.repository.get_changeset(next_rev) |
|
177 | cs = changeset.repository.get_changeset(next_rev) | |
178 |
|
178 | |||
179 | if branch and branch != cs.branch: |
|
179 | if branch and branch != cs.branch: | |
180 | return _next(cs, branch) |
|
180 | return _next(cs, branch) | |
181 |
|
181 | |||
182 | return cs |
|
182 | return cs | |
183 |
|
183 | |||
184 | return _next(self, branch) |
|
184 | return _next(self, branch) | |
185 |
|
185 | |||
186 | def prev(self, branch=None): |
|
186 | def prev(self, branch=None): | |
187 | if branch and self.branch != branch: |
|
187 | if branch and self.branch != branch: | |
188 | raise VCSError('Branch option used on changeset not belonging ' |
|
188 | raise VCSError('Branch option used on changeset not belonging ' | |
189 | 'to that branch') |
|
189 | 'to that branch') | |
190 |
|
190 | |||
191 | def _prev(changeset, branch): |
|
191 | def _prev(changeset, branch): | |
192 | try: |
|
192 | try: | |
193 | prev_ = changeset.revision - 1 |
|
193 | prev_ = changeset.revision - 1 | |
194 | if prev_ < 0: |
|
194 | if prev_ < 0: | |
195 | raise IndexError |
|
195 | raise IndexError | |
196 | prev_rev = changeset.repository.revisions[prev_] |
|
196 | prev_rev = changeset.repository.revisions[prev_] | |
197 | except IndexError: |
|
197 | except IndexError: | |
198 | raise ChangesetDoesNotExistError |
|
198 | raise ChangesetDoesNotExistError | |
199 |
|
199 | |||
200 | cs = changeset.repository.get_changeset(prev_rev) |
|
200 | cs = changeset.repository.get_changeset(prev_rev) | |
201 |
|
201 | |||
202 | if branch and branch != cs.branch: |
|
202 | if branch and branch != cs.branch: | |
203 | return _prev(cs, branch) |
|
203 | return _prev(cs, branch) | |
204 |
|
204 | |||
205 | return cs |
|
205 | return cs | |
206 |
|
206 | |||
207 | return _prev(self, branch) |
|
207 | return _prev(self, branch) | |
208 |
|
208 | |||
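`next()` and `prev()` above recurse until they find a changeset on the requested branch, raising `ChangesetDoesNotExistError` when they run off either end of the revision list. The same walk can be sketched iteratively (the data shapes here are illustrative stand-ins for the repository objects):

```python
# Sketch: the next()/prev() pattern -- step through an ordered revision
# list and skip revisions whose branch doesn't match. Iterative version
# of the recursive _next/_prev helpers; revisions/branches are
# hypothetical stand-ins for repository state.
def next_on_branch(revisions, branches, index, branch=None):
    """revisions: ordered ids; branches: {id: branch}; index: current pos."""
    i = index + 1
    while i < len(revisions):
        rev = revisions[i]
        if branch is None or branches[rev] == branch:
            return rev
        i += 1
    raise IndexError('no next changeset')
```

An iterative loop avoids the unbounded recursion depth the recursive helper would hit on a long run of off-branch revisions.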
209 | def get_file_mode(self, path): |
|
209 | def get_file_mode(self, path): | |
210 | """ |
|
210 | """ | |
211 | Returns stat mode of the file at the given ``path``. |
|
211 | Returns stat mode of the file at the given ``path``. | |
212 | """ |
|
212 | """ | |
213 | # ensure path is traversed |
|
213 | # ensure path is traversed | |
214 | self._get_id_for_path(path) |
|
214 | self._get_id_for_path(path) | |
215 | return self._stat_modes[path] |
|
215 | return self._stat_modes[path] | |
216 |
|
216 | |||
217 | def get_file_content(self, path): |
|
217 | def get_file_content(self, path): | |
218 | """ |
|
218 | """ | |
219 | Returns content of the file at given ``path``. |
|
219 | Returns content of the file at given ``path``. | |
220 | """ |
|
220 | """ | |
221 | id = self._get_id_for_path(path) |
|
221 | id = self._get_id_for_path(path) | |
222 | blob = self.repository._repo[id] |
|
222 | blob = self.repository._repo[id] | |
223 | return blob.as_pretty_string() |
|
223 | return blob.as_pretty_string() | |
224 |
|
224 | |||
225 | def get_file_size(self, path): |
|
225 | def get_file_size(self, path): | |
226 | """ |
|
226 | """ | |
227 | Returns size of the file at given ``path``. |
|
227 | Returns size of the file at given ``path``. | |
228 | """ |
|
228 | """ | |
229 | id = self._get_id_for_path(path) |
|
229 | id = self._get_id_for_path(path) | |
230 | blob = self.repository._repo[id] |
|
230 | blob = self.repository._repo[id] | |
231 | return blob.raw_length() |
|
231 | return blob.raw_length() | |
232 |
|
232 | |||
233 | def get_file_changeset(self, path): |
|
233 | def get_file_changeset(self, path): | |
234 | """ |
|
234 | """ | |
235 | Returns last commit of the file at the given ``path``. |
|
235 | Returns last commit of the file at the given ``path``. | |
236 | """ |
|
236 | """ | |
237 | node = self.get_node(path) |
|
237 | node = self.get_node(path) | |
238 | return node.history[0] |
|
238 | return node.history[0] | |
239 |
|
239 | |||
240 | def get_file_history(self, path): |
|
240 | def get_file_history(self, path): | |
241 | """ |
|
241 | """ | |
242 | Returns history of file as reversed list of ``Changeset`` objects for |
|
242 | Returns history of file as reversed list of ``Changeset`` objects for | |
243 | which file at given ``path`` has been modified. |
|
243 | which file at given ``path`` has been modified. | |
244 |
|
244 | |||
245 | TODO: This function now uses the underlying OS 'git' and 'grep' commands, |
|
245 | TODO: This function now uses the underlying OS 'git' and 'grep' commands, | |
246 | which is generally not good. Should be replaced with algorithm |
|
246 | which is generally not good. Should be replaced with algorithm | |
247 | iterating commits. |
|
247 | iterating commits. | |
248 | """ |
|
248 | """ | |
249 | cmd = 'log --pretty="format: %%H" --name-status -p %s -- "%s"' % ( |
|
249 | cmd = 'log --pretty="format: %%H" --name-status -p %s -- "%s"' % ( | |
250 | self.id, path |
|
250 | self.id, path | |
251 | ) |
|
251 | ) | |
252 | so, se = self.repository.run_git_command(cmd) |
|
252 | so, se = self.repository.run_git_command(cmd) | |
253 | ids = re.findall(r'\w{40}', so) |
|
253 | ids = re.findall(r'\w{40}', so) | |
254 | return [self.repository.get_changeset(id) for id in ids] |
|
254 | return [self.repository.get_changeset(id) for id in ids] | |
255 |
|
255 | |||
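`get_file_history` above shells out to `git log` and pulls the commit ids back out of the text with `re.findall(r'\w{40}', so)`. A small sketch of that extraction step, tightened to hex characters (the original's `\w{40}` would also match 40-character identifiers that are not SHAs):

```python
import re

# Sketch: extract the 40-character commit hashes that
# `git log --pretty="format: %H" --name-status` prints, in order,
# as get_file_history does after running the command.
def parse_history_ids(log_output):
    return re.findall(r'\b[0-9a-f]{40}\b', log_output)
```

The ids come back newest first, matching `git log`'s output order, so the resulting changeset list is already reversed history.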
256 | def get_file_annotate(self, path): |
|
256 | def get_file_annotate(self, path): | |
257 | """ |
|
257 | """ | |
258 | Returns a list of three-element tuples: (lineno, changeset, line) |
|
258 | Returns a list of three-element tuples: (lineno, changeset, line) | |
259 |
|
259 | |||
260 | TODO: This function now uses the underlying OS 'git' command, which is |
|
260 | TODO: This function now uses the underlying OS 'git' command, which is | |
261 | generally not good. Should be replaced with algorithm iterating |
|
261 | generally not good. Should be replaced with algorithm iterating | |
262 | commits. |
|
262 | commits. | |
263 | """ |
|
263 | """ | |
264 | cmd = 'blame -l --root -r %s -- "%s"' % (self.id, path) |
|
264 | cmd = 'blame -l --root -r %s -- "%s"' % (self.id, path) | |
265 | # -l ==> outputs long shas (and we need all 40 characters) |
|
265 | # -l ==> outputs long shas (and we need all 40 characters) | |
266 | # --root ==> doesn't put '^' character for boundaries |
|
266 | # --root ==> doesn't put '^' character for boundaries | |
267 | # -r sha ==> blames for the given revision |
|
267 | # -r sha ==> blames for the given revision | |
268 | so, se = self.repository.run_git_command(cmd) |
|
268 | so, se = self.repository.run_git_command(cmd) | |
269 | annotate = [] |
|
269 | annotate = [] | |
270 | for i, blame_line in enumerate(so.split('\n')[:-1]): |
|
270 | for i, blame_line in enumerate(so.split('\n')[:-1]): | |
271 | ln_no = i + 1 |
|
271 | ln_no = i + 1 | |
272 | id, line = re.split(r' \(.+?\) ', blame_line, 1) |
|
272 | id, line = re.split(r' \(.+?\) ', blame_line, 1) | |
273 | annotate.append((ln_no, self.repository.get_changeset(id), line)) |
|
273 | annotate.append((ln_no, self.repository.get_changeset(id), line)) | |
274 | return annotate |
|
274 | return annotate | |
275 |
|
275 | |||
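The blame-parsing step in `get_file_annotate` above splits each output line of `git blame -l --root -r <sha>` on the parenthesised author/date block. A standalone sketch of that split (the blame line format assumed here is `<sha> (<author> <date> <lineno>) <content>`):

```python
import re

# Sketch: split one line of `git blame -l --root -r <sha>` output into
# (commit id, source line), as get_file_annotate does per line.
# The non-greedy group consumes the "(author date lineno)" block.
def parse_blame_line(blame_line):
    commit_id, line = re.split(r' \(.+?\) ', blame_line, maxsplit=1)
    return commit_id, line
```

`-l` guarantees the full 40-character id on the left, and `--root` keeps boundary commits from being prefixed with `^`, which would otherwise break the fixed-width assumption.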
276 | def fill_archive(self, stream=None, kind='tgz', prefix=None, |
|
276 | def fill_archive(self, stream=None, kind='tgz', prefix=None, | |
277 | subrepos=False): |
|
277 | subrepos=False): | |
278 | """ |
|
278 | """ | |
279 | Fills up given stream. |
|
279 | Fills up given stream. | |
280 |
|
280 | |||
281 | :param stream: file like object. |
|
281 | :param stream: file like object. | |
282 | :param kind: one of following: ``zip``, ``tgz`` or ``tbz2``. |
|
282 | :param kind: one of following: ``zip``, ``tgz`` or ``tbz2``. | |
283 | Default: ``tgz``. |
|
283 | Default: ``tgz``. | |
284 | :param prefix: name of root directory in archive. |
|
284 | :param prefix: name of root directory in archive. | |
285 | Default is repository name and changeset's raw_id joined with dash |
|
285 | Default is repository name and changeset's raw_id joined with dash | |
286 | (``repo-tip.<KIND>``). |
|
286 | (``repo-tip.<KIND>``). | |
287 | :param subrepos: include subrepos in this archive. |
|
287 | :param subrepos: include subrepos in this archive. | |
288 |
|
288 | |||
289 | :raise ImproperArchiveTypeError: If given kind is wrong. |
|
289 | :raise ImproperArchiveTypeError: If given kind is wrong. | |
290 | :raise VcsError: If given stream is None |
|
290 | :raise VcsError: If given stream is None | |
291 |
|
291 | |||
292 | """ |
|
292 | """ | |
293 | allowed_kinds = settings.ARCHIVE_SPECS.keys() |
|
293 | allowed_kinds = settings.ARCHIVE_SPECS.keys() | |
294 | if kind not in allowed_kinds: |
|
294 | if kind not in allowed_kinds: | |
295 | raise ImproperArchiveTypeError('Archive kind not supported, use one ' |
|
295 | raise ImproperArchiveTypeError('Archive kind not supported, use one ' | |
296 | 'of %s', allowed_kinds) |
|
296 | 'of %s', allowed_kinds) | |
297 |
|
297 | |||
298 | if prefix is None: |
|
298 | if prefix is None: | |
299 | prefix = '%s-%s' % (self.repository.name, self.short_id) |
|
299 | prefix = '%s-%s' % (self.repository.name, self.short_id) | |
300 | elif prefix.startswith('/'): |
|
300 | elif prefix.startswith('/'): | |
301 | raise VCSError("Prefix cannot start with leading slash") |
|
301 | raise VCSError("Prefix cannot start with leading slash") | |
302 | elif prefix.strip() == '': |
|
302 | elif prefix.strip() == '': | |
303 | raise VCSError("Prefix cannot be empty") |
|
303 | raise VCSError("Prefix cannot be empty") | |
304 |
|
304 | |||
305 | if kind == 'zip': |
|
305 | if kind == 'zip': | |
306 | frmt = 'zip' |
|
306 | frmt = 'zip' | |
307 | else: |
|
307 | else: | |
308 | frmt = 'tar' |
|
308 | frmt = 'tar' | |
309 | cmd = 'git archive --format=%s --prefix=%s/ %s' % (frmt, prefix, |
|
309 | cmd = 'git archive --format=%s --prefix=%s/ %s' % (frmt, prefix, | |
310 | self.raw_id) |
|
310 | self.raw_id) | |
311 | if kind == 'tgz': |
|
311 | if kind == 'tgz': | |
312 | cmd += ' | gzip -9' |
|
312 | cmd += ' | gzip -9' | |
313 | elif kind == 'tbz2': |
|
313 | elif kind == 'tbz2': | |
314 | cmd += ' | bzip2 -9' |
|
314 | cmd += ' | bzip2 -9' | |
315 |
|
315 | |||
316 | if stream is None: |
|
316 | if stream is None: | |
317 | raise VCSError('You need to pass in a valid stream for filling' |
|
317 | raise VCSError('You need to pass in a valid stream for filling' | |
318 | ' with archival data') |
|
318 | ' with archival data') | |
319 | popen = Popen(cmd, stdout=PIPE, stderr=PIPE, shell=True, |
|
319 | popen = Popen(cmd, stdout=PIPE, stderr=PIPE, shell=True, | |
320 | cwd=self.repository.path) |
|
320 | cwd=self.repository.path) | |
321 |
|
321 | |||
322 | buffer_size = 1024 * 8 |
|
322 | buffer_size = 1024 * 8 | |
323 | chunk = popen.stdout.read(buffer_size) |
|
323 | chunk = popen.stdout.read(buffer_size) | |
324 | while chunk: |
|
324 | while chunk: | |
325 | stream.write(chunk) |
|
325 | stream.write(chunk) | |
326 | chunk = popen.stdout.read(buffer_size) |
|
326 | chunk = popen.stdout.read(buffer_size) | |
327 | # Make sure all descriptors would be read |
|
327 | # Make sure all descriptors would be read | |
328 | popen.communicate() |
|
328 | popen.communicate() | |
329 |
|
329 | |||
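`fill_archive` above pipes `git archive` (optionally through `gzip`/`bzip2`) and copies its stdout into the caller's stream in fixed-size chunks, finishing with `communicate()` so all descriptors are drained. The copy loop can be sketched on its own; the demo command here is an illustrative `echo`, not the real archive pipeline:

```python
import io
from subprocess import Popen, PIPE

# Sketch: stream a shell command's stdout into a file-like object in
# fixed-size chunks, then drain remaining descriptors, mirroring
# fill_archive's copy loop.
def stream_command(cmd, stream, buffer_size=1024 * 8):
    popen = Popen(cmd, stdout=PIPE, stderr=PIPE, shell=True)
    chunk = popen.stdout.read(buffer_size)
    while chunk:
        stream.write(chunk)
        chunk = popen.stdout.read(buffer_size)
    popen.communicate()  # make sure all descriptors are read
    return popen.returncode
```

Reading in bounded chunks keeps memory flat regardless of archive size, which is why the method writes through rather than buffering the whole archive.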
330 | def get_nodes(self, path): |
|
330 | def get_nodes(self, path): | |
331 | if self._get_kind(path) != NodeKind.DIR: |
|
331 | if self._get_kind(path) != NodeKind.DIR: | |
332 | raise ChangesetError("Directory does not exist for revision %r at " |
|
332 | raise ChangesetError("Directory does not exist for revision %r at " | |
333 | " %r" % (self.revision, path)) |
|
333 | " %r" % (self.revision, path)) | |
334 | path = self._fix_path(path) |
|
334 | path = self._fix_path(path) | |
335 | id = self._get_id_for_path(path) |
|
335 | id = self._get_id_for_path(path) | |
336 | tree = self.repository._repo[id] |
|
336 | tree = self.repository._repo[id] | |
337 | dirnodes = [] |
|
337 | dirnodes = [] | |
338 | filenodes = [] |
|
338 | filenodes = [] | |
339 | for name, stat, id in tree.iteritems(): |
|
339 | for name, stat, id in tree.iteritems(): | |
340 | obj = self.repository._repo.get_object(id) |
|
340 | obj = self.repository._repo.get_object(id) | |
341 | if path != '': |
|
341 | if path != '': | |
342 | obj_path = '/'.join((path, name)) |
|
342 | obj_path = '/'.join((path, name)) | |
343 | else: |
|
343 | else: | |
344 | obj_path = name |
|
344 | obj_path = name | |
345 | if obj_path not in self._stat_modes: |
|
345 | if obj_path not in self._stat_modes: | |
346 | self._stat_modes[obj_path] = stat |
|
346 | self._stat_modes[obj_path] = stat | |
347 | if isinstance(obj, objects.Tree): |
|
347 | if isinstance(obj, objects.Tree): | |
348 | dirnodes.append(DirNode(obj_path, changeset=self)) |
|
348 | dirnodes.append(DirNode(obj_path, changeset=self)) | |
349 | elif isinstance(obj, objects.Blob): |
|
349 | elif isinstance(obj, objects.Blob): | |
350 | filenodes.append(FileNode(obj_path, changeset=self, mode=stat)) |
|
350 | filenodes.append(FileNode(obj_path, changeset=self, mode=stat)) | |
351 | else: |
|
351 | else: | |
352 | raise ChangesetError("Requested object should be Tree " |
|
352 | raise ChangesetError("Requested object should be Tree " | |
353 | "or Blob, is %r" % type(obj)) |
|
353 | "or Blob, is %r" % type(obj)) | |
354 | nodes = dirnodes + filenodes |
|
354 | nodes = dirnodes + filenodes | |
355 | for node in nodes: |
|
355 | for node in nodes: | |
356 | if not node.path in self.nodes: |
|
356 | if not node.path in self.nodes: | |
357 | self.nodes[node.path] = node |
|
357 | self.nodes[node.path] = node | |
358 | nodes.sort() |
|
358 | nodes.sort() | |
359 | return nodes |
|
359 | return nodes | |
360 |
|
360 | |||
361 | def get_node(self, path): |
|
361 | def get_node(self, path): | |
362 | if isinstance(path, unicode): |
|
362 | if isinstance(path, unicode): | |
363 | path = path.encode('utf-8') |
|
363 | path = path.encode('utf-8') | |
364 | path = self._fix_path(path) |
|
364 | path = self._fix_path(path) | |
365 | if not path in self.nodes: |
|
365 | if not path in self.nodes: | |
366 | try: |
|
366 | try: | |
367 | id = self._get_id_for_path(path) |
|
367 | id = self._get_id_for_path(path) | |
368 | except ChangesetError: |
|
368 | except ChangesetError: | |
369 | raise NodeDoesNotExistError("Cannot find one of parents' " |
|
369 | raise NodeDoesNotExistError("Cannot find one of parents' " | |
370 | "directories for a given path: %s" % path) |
|
370 | "directories for a given path: %s" % path) | |
371 | obj = self.repository._repo.get_object(id) |
|
371 | obj = self.repository._repo.get_object(id) | |
372 | if isinstance(obj, objects.Tree): |
|
372 | if isinstance(obj, objects.Tree): | |
373 | if path == '': |
|
373 | if path == '': | |
374 | node = RootNode(changeset=self) |
|
374 | node = RootNode(changeset=self) | |
375 | else: |
|
375 | else: | |
376 | node = DirNode(path, changeset=self) |
|
376 | node = DirNode(path, changeset=self) | |
377 | node._tree = obj |
|
377 | node._tree = obj | |
378 | elif isinstance(obj, objects.Blob): |
|
378 | elif isinstance(obj, objects.Blob): | |
379 | node = FileNode(path, changeset=self) |
|
379 | node = FileNode(path, changeset=self) | |
380 | node._blob = obj |
|
380 | node._blob = obj | |
381 | else: |
|
381 | else: | |
382 | raise NodeDoesNotExistError("There is no file nor directory " |
|
382 | raise NodeDoesNotExistError("There is no file nor directory " | |
383 | "at the given path %r at revision %r" |
|
383 | "at the given path %r at revision %r" | |
384 | % (path, self.short_id)) |
|
384 | % (path, self.short_id)) | |
385 | # cache node |
|
385 | # cache node | |
386 | self.nodes[path] = node |
|
386 | self.nodes[path] = node | |
387 | return self.nodes[path] |
|
387 | return self.nodes[path] | |
388 |
|
388 | |||
389 | @LazyProperty |
|
389 | @LazyProperty | |
390 | def affected_files(self): |
|
390 | def affected_files(self): | |
391 | """ |
|
391 | """ | |
392 | Gets fast-accessible file changes for the given changeset |
|
392 | Gets fast-accessible file changes for the given changeset | |
393 | """ |
|
393 | """ | |
394 |
|
394 | |||
395 | return self.added + self.changed |
|
395 | return self.added + self.changed | |
396 |
|
396 | |||
397 | @LazyProperty |
|
397 | @LazyProperty | |
398 | def _diff_name_status(self): |
|
398 | def _diff_name_status(self): | |
399 | output = [] |
|
399 | output = [] | |
400 | for parent in self.parents: |
|
400 | for parent in self.parents: | |
401 | cmd = 'diff --name-status %s %s' % (parent.raw_id, self.raw_id) |
|
401 | cmd = 'diff --name-status %s %s' % (parent.raw_id, self.raw_id) | |
402 | so, se = self.repository.run_git_command(cmd) |
|
402 | so, se = self.repository.run_git_command(cmd) | |
403 | output.append(so.strip()) |
|
403 | output.append(so.strip()) | |
404 | return '\n'.join(output) |
|
404 | return '\n'.join(output) | |
405 |
|
405 | |||
406 | def _get_paths_for_status(self, status): |
|
406 | def _get_paths_for_status(self, status): | |
407 | """ |
|
407 | """ | |
408 | Returns sorted list of paths for given ``status``. |
|
408 | Returns sorted list of paths for given ``status``. | |
409 |
|
409 | |||
410 | :param status: one of: *added*, *modified* or *deleted* |
|
410 | :param status: one of: *added*, *modified* or *deleted* | |
411 | """ |
|
411 | """ | |
412 | paths = set() |
|
412 | paths = set() | |
413 | char = status[0].upper() |
|
413 | char = status[0].upper() | |
414 | for line in self._diff_name_status.splitlines(): |
|
414 | for line in self._diff_name_status.splitlines(): | |
415 | if not line: |
|
415 | if not line: | |
416 | continue |
|
416 | continue | |
417 | if line.startswith(char): |
|
417 | if line.startswith(char): | |
418 | splitted = line.split(char,1) |
|
418 | splitted = line.split(char,1) | |
419 | if not len(splitted) == 2: |
|
419 | if not len(splitted) == 2: | |
420 | raise VCSError("Couldn't parse diff result:\n%s\n\n and " |
|
420 | raise VCSError("Couldn't parse diff result:\n%s\n\n and " | |
421 | "particularly that line: %s" % (self._diff_name_status, |
|
421 | "particularly that line: %s" % (self._diff_name_status, | |
422 | line)) |
|
422 | line)) | |
423 | paths.add(splitted[1].strip()) |
|
423 | paths.add(splitted[1].strip()) | |
424 | return sorted(paths) |
|
424 | return sorted(paths) | |
425 |
|
425 | |||
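`_get_paths_for_status` above filters the cached `git diff --name-status` output by the first letter of the requested status. A standalone sketch of that parse, keeping the same split-on-status-letter approach as the diff:

```python
# Sketch: collect paths carrying a given status letter from
# `git diff --name-status` output, as _get_paths_for_status does.
def paths_for_status(name_status_output, status):
    char = status[0].upper()  # 'added' -> 'A', 'modified' -> 'M', ...
    paths = set()
    for line in name_status_output.splitlines():
        if not line:
            continue
        if line.startswith(char):
            splitted = line.split(char, 1)
            if len(splitted) != 2:
                raise ValueError("Couldn't parse diff line: %s" % line)
            paths.add(splitted[1].strip())
    return sorted(paths)
```

Collecting into a set before sorting deduplicates paths that appear once per parent in a merge changeset, since `_diff_name_status` concatenates one diff per parent.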
426 | @LazyProperty |
|
426 | @LazyProperty | |
427 | def added(self): |
|
427 | def added(self): | |
428 | """ |
|
428 | """ | |
429 | Returns list of added ``FileNode`` objects. |
|
429 | Returns list of added ``FileNode`` objects. | |
430 | """ |
|
430 | """ | |
431 | if not self.parents: |
|
431 | if not self.parents: | |
432 | return list(self._get_file_nodes()) |
|
432 | return list(self._get_file_nodes()) | |
433 | return [self.get_node(path) for path in self._get_paths_for_status('added')] |
|
433 | return [self.get_node(path) for path in self._get_paths_for_status('added')] | |
434 |
|
434 | |||
435 | @LazyProperty |
|
435 | @LazyProperty | |
436 | def changed(self): |
|
436 | def changed(self): | |
437 | """ |
|
437 | """ | |
438 | Returns list of modified ``FileNode`` objects. |
|
438 | Returns list of modified ``FileNode`` objects. | |
439 | """ |
|
439 | """ | |
440 | if not self.parents: |
|
440 | if not self.parents: | |
441 | return [] |
|
441 | return [] | |
442 | return [self.get_node(path) for path in self._get_paths_for_status('modified')] |
|
442 | return [self.get_node(path) for path in self._get_paths_for_status('modified')] | |
443 |
|
443 | |||
444 | @LazyProperty |
|
444 | @LazyProperty | |
445 | def removed(self): |
|
445 | def removed(self): | |
446 | """ |
|
446 | """ | |
447 | Returns list of removed ``FileNode`` objects. |
|
447 | Returns list of removed ``FileNode`` objects. | |
448 | """ |
|
448 | """ | |
449 | if not self.parents: |
|
449 | if not self.parents: | |
450 | return [] |
|
450 | return [] | |
451 | return [RemovedFileNode(path) for path in self._get_paths_for_status('deleted')] |
|
451 | return [RemovedFileNode(path) for path in self._get_paths_for_status('deleted')] |
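The `_get_paths_for_status` logic above splits `diff --name-status`-style lines on a status character; a minimal standalone sketch of that parsing (the function name and sample input are illustrative, not RhodeCode's API):

```python
def paths_for_status(diff_name_status, char):
    """Collect sorted, de-duplicated paths for lines starting with `char`.

    Each input line looks like 'A\tsome/path'; a line that starts with the
    status character but cannot be split in two is a parse error, mirroring
    the VCSError raised above.
    """
    paths = set()
    for line in diff_name_status.splitlines():
        if not line.startswith(char):
            continue
        splitted = line.split(char, 1)
        if len(splitted) != 2:
            raise ValueError("Couldn't parse diff line: %s" % line)
        paths.add(splitted[1].strip())
    return sorted(paths)
```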
@@ -1,338 +1,337 @@
import os
import posixpath

from rhodecode.lib.vcs.backends.base import BaseChangeset
from rhodecode.lib.vcs.conf import settings
from rhodecode.lib.vcs.exceptions import ChangesetDoesNotExistError, \
    ChangesetError, ImproperArchiveTypeError, NodeDoesNotExistError, VCSError
from rhodecode.lib.vcs.nodes import AddedFileNodesGenerator, ChangedFileNodesGenerator, \
    DirNode, FileNode, NodeKind, RemovedFileNodesGenerator, RootNode

from rhodecode.lib.vcs.utils import safe_str, safe_unicode, date_fromtimestamp
from rhodecode.lib.vcs.utils.lazy import LazyProperty
from rhodecode.lib.vcs.utils.paths import get_dirs_for_path

from ...utils.hgcompat import archival, hex


class MercurialChangeset(BaseChangeset):
    """
    Represents state of the repository at the single revision.
    """

    def __init__(self, repository, revision):
        self.repository = repository
        self.raw_id = revision
        self._ctx = repository._repo[revision]
        self.revision = self._ctx._rev
        self.nodes = {}

    @LazyProperty
    def tags(self):
        return map(safe_unicode, self._ctx.tags())

    @LazyProperty
    def branch(self):
        return safe_unicode(self._ctx.branch())

    @LazyProperty
    def message(self):
        return safe_unicode(self._ctx.description())

    @LazyProperty
    def author(self):
        return safe_unicode(self._ctx.user())

    @LazyProperty
    def date(self):
        return date_fromtimestamp(*self._ctx.date())

    @LazyProperty
    def status(self):
        """
        Returns modified, added, removed, deleted files for current changeset
        """
        return self.repository._repo.status(self._ctx.p1().node(),
                                            self._ctx.node())

    @LazyProperty
    def _file_paths(self):
        return list(self._ctx)

    @LazyProperty
    def _dir_paths(self):
        p = list(set(get_dirs_for_path(*self._file_paths)))
        p.insert(0, '')
        return p

    @LazyProperty
    def _paths(self):
        return self._dir_paths + self._file_paths

    @LazyProperty
    def id(self):
        if self.last:
            return u'tip'
        return self.short_id

    @LazyProperty
    def short_id(self):
        return self.raw_id[:12]

    @LazyProperty
    def parents(self):
        """
        Returns list of parent changesets.
        """
        return [self.repository.get_changeset(parent.rev())
                for parent in self._ctx.parents() if parent.rev() >= 0]

    def next(self, branch=None):

        if branch and self.branch != branch:
            raise VCSError('Branch option used on changeset not belonging '
                           'to that branch')

        def _next(changeset, branch):
            try:
                next_ = changeset.revision + 1
                next_rev = changeset.repository.revisions[next_]
            except IndexError:
                raise ChangesetDoesNotExistError
            cs = changeset.repository.get_changeset(next_rev)

            if branch and branch != cs.branch:
                return _next(cs, branch)

            return cs

        return _next(self, branch)
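`next()` above recurses forward one revision at a time, skipping changesets on other branches until it finds one on the requested branch or runs out of revisions. The same walk over a plain list of branch names indexed by revision number, as a sketch (the list-based model is an illustration, not the repository API):

```python
def next_on_branch(revisions, index, branch=None):
    """Return the index of the next revision after `index`, optionally
    restricted to `branch`. Raises IndexError when the walk falls off the
    end of history, much like ChangesetDoesNotExistError above."""
    i = index + 1
    while True:
        if i >= len(revisions):
            raise IndexError('no next changeset')
        if branch is None or revisions[i] == branch:
            return i
        i += 1
```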

    def prev(self, branch=None):
        if branch and self.branch != branch:
            raise VCSError('Branch option used on changeset not belonging '
                           'to that branch')

        def _prev(changeset, branch):
            try:
                prev_ = changeset.revision - 1
                if prev_ < 0:
                    raise IndexError
                prev_rev = changeset.repository.revisions[prev_]
            except IndexError:
                raise ChangesetDoesNotExistError

            cs = changeset.repository.get_changeset(prev_rev)

            if branch and branch != cs.branch:
                return _prev(cs, branch)

            return cs

        return _prev(self, branch)

    def _fix_path(self, path):
        """
        Paths are stored without trailing slash so we need to get rid of it
        if needed. Also mercurial keeps filenodes as str so we need to
        convert from unicode to str.
        """
        if path.endswith('/'):
            path = path.rstrip('/')

        return safe_str(path)

    def _get_kind(self, path):
        path = self._fix_path(path)
        if path in self._file_paths:
            return NodeKind.FILE
        elif path in self._dir_paths:
            return NodeKind.DIR
        else:
            raise ChangesetError("Node does not exist at the given path %r"
                                 % (path))

    def _get_filectx(self, path):
        path = self._fix_path(path)
        if self._get_kind(path) != NodeKind.FILE:
            raise ChangesetError("File does not exist for revision %r at "
                                 "%r" % (self.revision, path))
        return self._ctx.filectx(path)

    def get_file_mode(self, path):
        """
        Returns stat mode of the file at the given ``path``.
        """
        fctx = self._get_filectx(path)
        if 'x' in fctx.flags():
            return 0100755
        else:
            return 0100644

    def get_file_content(self, path):
        """
        Returns content of the file at given ``path``.
        """
        fctx = self._get_filectx(path)
        return fctx.data()

    def get_file_size(self, path):
        """
        Returns size of the file at given ``path``.
        """
        fctx = self._get_filectx(path)
        return fctx.size()

    def get_file_changeset(self, path):
        """
        Returns last commit of the file at the given ``path``.
        """
-        fctx = self._get_filectx(path)
-        changeset = self.repository.get_changeset(fctx.linkrev())
-        return changeset
+        node = self.get_node(path)
+        return node.history[0]

    def get_file_history(self, path):
        """
        Returns history of file as reversed list of ``Changeset`` objects for
        which file at given ``path`` has been modified.
        """
        fctx = self._get_filectx(path)
        nodes = [fctx.filectx(x).node() for x in fctx.filelog()]
        changesets = [self.repository.get_changeset(hex(node))
                      for node in reversed(nodes)]
        return changesets

    def get_file_annotate(self, path):
        """
        Returns a list of three element tuples with lineno, changeset and line
        """
        fctx = self._get_filectx(path)
        annotate = []
        for i, annotate_data in enumerate(fctx.annotate()):
            ln_no = i + 1
            annotate.append((ln_no, self.repository\
                             .get_changeset(hex(annotate_data[0].node())),
                             annotate_data[1],))

        return annotate

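`get_file_mode` above maps Mercurial's per-file flags string to a POSIX stat mode. The mapping in isolation, written with modern octal literals since the listing itself uses Python 2 syntax:

```python
def file_mode(flags):
    """Return the stat mode for a file given its Mercurial flags string:
    executable files ('x' flag) get 0o100755, everything else 0o100644."""
    return 0o100755 if 'x' in flags else 0o100644
```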
    def fill_archive(self, stream=None, kind='tgz', prefix=None,
                     subrepos=False):
        """
        Fills up given stream.

        :param stream: file like object.
        :param kind: one of following: ``zip``, ``tgz`` or ``tbz2``.
            Default: ``tgz``.
        :param prefix: name of root directory in archive.
            Default is repository name and changeset's raw_id joined with dash
            (``repo-tip.<KIND>``).
        :param subrepos: include subrepos in this archive.

        :raise ImproperArchiveTypeError: If given kind is wrong.
        :raise VCSError: If given stream is None
        """

        allowed_kinds = settings.ARCHIVE_SPECS.keys()
        if kind not in allowed_kinds:
            raise ImproperArchiveTypeError('Archive kind not supported use one '
                                           'of %s' % allowed_kinds)

        if stream is None:
            raise VCSError('You need to pass in a valid stream for filling'
                           ' with archival data')

        if prefix is None:
            prefix = '%s-%s' % (self.repository.name, self.short_id)
        elif prefix.startswith('/'):
            raise VCSError("Prefix cannot start with leading slash")
        elif prefix.strip() == '':
            raise VCSError("Prefix cannot be empty")

        archival.archive(self.repository._repo, stream, self.raw_id,
                         kind, prefix=prefix, subrepos=subrepos)

        #stream.close()

        if stream.closed and hasattr(stream, 'name'):
            stream = open(stream.name, 'rb')
        elif hasattr(stream, 'mode') and 'r' not in stream.mode:
            stream = open(stream.name, 'rb')
        else:
            stream.seek(0)

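`fill_archive` above validates the kind and prefix before delegating to `archival.archive`. The guard clauses alone, sketched with a hypothetical allowed-kinds mapping (RhodeCode reads the real one from `settings.ARCHIVE_SPECS`):

```python
# Illustrative stand-in for settings.ARCHIVE_SPECS.
ARCHIVE_SPECS = {'tgz': '.tar.gz', 'tbz2': '.tar.bz2', 'zip': '.zip'}


def check_archive_args(kind, prefix, repo_name, short_id):
    """Validate archive kind/prefix and compute the default prefix,
    mirroring the guard clauses above; returns the effective prefix."""
    if kind not in ARCHIVE_SPECS:
        raise ValueError('Archive kind not supported, use one of %s'
                         % sorted(ARCHIVE_SPECS))
    if prefix is None:
        prefix = '%s-%s' % (repo_name, short_id)
    elif prefix.startswith('/'):
        raise ValueError("Prefix cannot start with leading slash")
    elif prefix.strip() == '':
        raise ValueError("Prefix cannot be empty")
    return prefix
```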
    def get_nodes(self, path):
        """
        Returns combined ``DirNode`` and ``FileNode`` objects list representing
        state of changeset at the given ``path``. If node at the given ``path``
        is not instance of ``DirNode``, ChangesetError would be raised.
        """

        if self._get_kind(path) != NodeKind.DIR:
            raise ChangesetError("Directory does not exist for revision %r at "
                                 "%r" % (self.revision, path))
        path = self._fix_path(path)
        filenodes = [FileNode(f, changeset=self) for f in self._file_paths
                     if os.path.dirname(f) == path]
        dirs = path == '' and '' or [d for d in self._dir_paths
                                     if d and posixpath.dirname(d) == path]
        dirnodes = [DirNode(d, changeset=self) for d in dirs
                    if os.path.dirname(d) == path]
        nodes = dirnodes + filenodes
        # cache nodes
        for node in nodes:
            self.nodes[node.path] = node
        nodes.sort()
        return nodes

    def get_node(self, path):
        """
        Returns ``Node`` object from the given ``path``. If there is no node at
        the given ``path``, ``ChangesetError`` would be raised.
        """

        path = self._fix_path(path)

        if not path in self.nodes:
            if path in self._file_paths:
                node = FileNode(path, changeset=self)
            elif path in self._dir_paths:
                if path == '':
                    node = RootNode(changeset=self)
                else:
                    node = DirNode(path, changeset=self)
            else:
                raise NodeDoesNotExistError("There is no file nor directory "
                                            "at the given path: %r at revision %r"
                                            % (path, self.short_id))
            # cache node
            self.nodes[path] = node
        return self.nodes[path]

    @LazyProperty
    def affected_files(self):
        """
        Gets fast accessible file changes for given changeset
        """
        return self._ctx.files()

    @property
    def added(self):
        """
        Returns list of added ``FileNode`` objects.
        """
        return AddedFileNodesGenerator([n for n in self.status[1]], self)

    @property
    def changed(self):
        """
        Returns list of modified ``FileNode`` objects.
        """
        return ChangedFileNodesGenerator([n for n in self.status[0]], self)

    @property
    def removed(self):
        """
        Returns list of removed ``FileNode`` objects.
        """
        return RemovedFileNodesGenerator([n for n in self.status[2]], self)
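Both listings lean on `@LazyProperty` to compute values such as `tags`, `status` and `_file_paths` once per instance. A minimal sketch of such a caching descriptor (an approximation of the behaviour, not necessarily the library's exact implementation): the first access runs the wrapped function and stores the result in the instance `__dict__`, which shadows the descriptor on every later access.

```python
class LazyProperty(object):
    """Descriptor that computes a value on first access and caches it
    in the instance __dict__, shadowing the descriptor afterwards."""

    def __init__(self, func):
        self._func = func
        self.__name__ = func.__name__
        self.__doc__ = func.__doc__

    def __get__(self, obj, klass=None):
        if obj is None:
            return self
        # Cache the computed value on the instance; because this class
        # defines no __set__, the instance attribute wins next time.
        result = obj.__dict__[self.__name__] = self._func(obj)
        return result


class Example(object):
    calls = 0

    @LazyProperty
    def value(self):
        Example.calls += 1
        return 42
```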
@@ -1,551 +1,559 @@
# -*- coding: utf-8 -*-
"""
    vcs.nodes
    ~~~~~~~~~

    Module holding everything related to vcs nodes.

    :created_on: Apr 8, 2010
    :copyright: (c) 2010-2011 by Marcin Kuzminski, Lukasz Balcerzak.
"""
import stat
import posixpath
import mimetypes

from rhodecode.lib.vcs.utils.lazy import LazyProperty
from rhodecode.lib.vcs.utils import safe_unicode
from rhodecode.lib.vcs.exceptions import NodeError
from rhodecode.lib.vcs.exceptions import RemovedFileNodeError

from pygments import lexers


class NodeKind:
    DIR = 1
    FILE = 2


class NodeState:
    ADDED = u'added'
    CHANGED = u'changed'
    NOT_CHANGED = u'not changed'
    REMOVED = u'removed'


class NodeGeneratorBase(object):
    """
    Base class for removed, added and changed filenodes; it's a lazy generator
    class that will create filenodes only on iteration or call

    The len method doesn't need to create filenodes at all
    """

    def __init__(self, current_paths, cs):
        self.cs = cs
        self.current_paths = current_paths

    def __call__(self):
        return [n for n in self]

    def __getslice__(self, i, j):
        for p in self.current_paths[i:j]:
            yield self.cs.get_node(p)

    def __len__(self):
        return len(self.current_paths)

    def __iter__(self):
        for p in self.current_paths:
            yield self.cs.get_node(p)


class AddedFileNodesGenerator(NodeGeneratorBase):
    """
    Class holding Added files for current changeset
    """
    pass


class ChangedFileNodesGenerator(NodeGeneratorBase):
    """
    Class holding Changed files for current changeset
    """
    pass


class RemovedFileNodesGenerator(NodeGeneratorBase):
    """
    Class holding removed files for current changeset
    """
    def __iter__(self):
        for p in self.current_paths:
            yield RemovedFileNode(path=p)

    def __getslice__(self, i, j):
        for p in self.current_paths[i:j]:
            yield RemovedFileNode(path=p)


class Node(object):
    """
    Simplest class representing file or directory on repository. SCM backends
    should use ``FileNode`` and ``DirNode`` subclasses rather than ``Node``
    directly.

    Node's ``path`` cannot start with slash as we operate on *relative* paths
    only. Moreover, every single node is identified by the ``path`` attribute,
    so it cannot end with slash, too. Otherwise, path could lead to mistakes.
    """

    def __init__(self, path, kind):
        if path.startswith('/'):
            raise NodeError("Cannot initialize Node objects with slash at "
                            "the beginning as only relative paths are supported")
        self.path = path.rstrip('/')
        if path == '' and kind != NodeKind.DIR:
            raise NodeError("Only DirNode and its subclasses may be "
                            "initialized with empty path")
        self.kind = kind
        #self.dirs, self.files = [], []
        if self.is_root() and not self.is_dir():
            raise NodeError("Root node cannot be FILE kind")

    @LazyProperty
    def parent(self):
        parent_path = self.get_parent_path()
        if parent_path:
            if self.changeset:
                return self.changeset.get_node(parent_path)
            return DirNode(parent_path)
        return None

    @LazyProperty
    def name(self):
        """
        Returns name of the node, so if its path is the full path then only
        the last part is returned.
        """
        return safe_unicode(self.path.rstrip('/').split('/')[-1])

    def _get_kind(self):
        return self._kind

    def _set_kind(self, kind):
        if hasattr(self, '_kind'):
            raise NodeError("Cannot change node's kind")
        else:
            self._kind = kind
            # Post setter check (path's trailing slash)
            if self.path.endswith('/'):
                raise NodeError("Node's path cannot end with slash")

    kind = property(_get_kind, _set_kind)

    def __cmp__(self, other):
        """
        Comparator using name of the node, needed for quick list sorting.
        """
        kind_cmp = cmp(self.kind, other.kind)
        if kind_cmp:
            return kind_cmp
        return cmp(self.name, other.name)

    def __eq__(self, other):
        for attr in ['name', 'path', 'kind']:
            if getattr(self, attr) != getattr(other, attr):
                return False
        if self.is_file():
            if self.content != other.content:
                return False
        else:
            # For DirNode's check without entering each dir
            self_nodes_paths = list(sorted(n.path for n in self.nodes))
|
162 | self_nodes_paths = list(sorted(n.path for n in self.nodes)) | |
163 | other_nodes_paths = list(sorted(n.path for n in other.nodes)) |
 |
163 | other_nodes_paths = list(sorted(n.path for n in other.nodes)) | |
164 | if self_nodes_paths != other_nodes_paths: |
|
164 | if self_nodes_paths != other_nodes_paths: | |
165 | return False |
|
165 | return False | |
166 | return True |
|
166 | return True | |
167 |
|
167 | |||
168 | def __ne__(self, other): |
 |
168 | def __ne__(self, other): | |
169 | return not self.__eq__(other) |
|
169 | return not self.__eq__(other) | |
170 |
|
170 | |||
171 | def __repr__(self): |
|
171 | def __repr__(self): | |
172 | return '<%s %r>' % (self.__class__.__name__, self.path) |
|
172 | return '<%s %r>' % (self.__class__.__name__, self.path) | |
173 |
|
173 | |||
174 | def __str__(self): |
|
174 | def __str__(self): | |
175 | return self.__repr__() |
|
175 | return self.__repr__() | |
176 |
|
176 | |||
177 | def __unicode__(self): |
|
177 | def __unicode__(self): | |
178 | return self.name |
|
178 | return self.name | |
179 |
|
179 | |||
180 | def get_parent_path(self): |
|
180 | def get_parent_path(self): | |
181 | """ |
|
181 | """ | |
182 | Returns node's parent path or empty string if node is root. |
|
182 | Returns node's parent path or empty string if node is root. | |
183 | """ |
|
183 | """ | |
184 | if self.is_root(): |
|
184 | if self.is_root(): | |
185 | return '' |
|
185 | return '' | |
186 | return posixpath.dirname(self.path.rstrip('/')) + '/' |
|
186 | return posixpath.dirname(self.path.rstrip('/')) + '/' | |
187 |
|
187 | |||
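The path arithmetic in ``get_parent_path`` is easy to check in isolation. A minimal, hypothetical sketch (not part of RhodeCode) of the same computation with the standard ``posixpath`` module:

```python
import posixpath

def parent_path(path, is_root=False):
    """Parent path with a trailing slash, or '' for the root node."""
    if is_root:
        return ''
    return posixpath.dirname(path.rstrip('/')) + '/'

print(parent_path('docs/api/index.rst'))  # docs/api/
print(parent_path('docs/'))               # /
```

Note that for a top-level node ``posixpath.dirname`` returns ``''``, so the parent comes out as ``'/'``, exactly as in the method above.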
188 | def is_file(self): |
|
188 | def is_file(self): | |
189 | """ |
|
189 | """ | |
190 | Returns ``True`` if node's kind is ``NodeKind.FILE``, ``False`` |
|
190 | Returns ``True`` if node's kind is ``NodeKind.FILE``, ``False`` | |
191 | otherwise. |
|
191 | otherwise. | |
192 | """ |
|
192 | """ | |
193 | return self.kind == NodeKind.FILE |
|
193 | return self.kind == NodeKind.FILE | |
194 |
|
194 | |||
195 | def is_dir(self): |
|
195 | def is_dir(self): | |
196 | """ |
|
196 | """ | |
197 | Returns ``True`` if node's kind is ``NodeKind.DIR``, ``False`` |
|
197 | Returns ``True`` if node's kind is ``NodeKind.DIR``, ``False`` | |
198 | otherwise. |
|
198 | otherwise. | |
199 | """ |
|
199 | """ | |
200 | return self.kind == NodeKind.DIR |
|
200 | return self.kind == NodeKind.DIR | |
201 |
|
201 | |||
202 | def is_root(self): |
|
202 | def is_root(self): | |
203 | """ |
|
203 | """ | |
204 | Returns ``True`` if node is a root node and ``False`` otherwise. |
|
204 | Returns ``True`` if node is a root node and ``False`` otherwise. | |
205 | """ |
|
205 | """ | |
206 | return self.kind == NodeKind.DIR and self.path == '' |
|
206 | return self.kind == NodeKind.DIR and self.path == '' | |
207 |
|
207 | |||
208 | @LazyProperty |
|
208 | @LazyProperty | |
209 | def added(self): |
|
209 | def added(self): | |
210 | return self.state is NodeState.ADDED |
|
210 | return self.state is NodeState.ADDED | |
211 |
|
211 | |||
212 | @LazyProperty |
|
212 | @LazyProperty | |
213 | def changed(self): |
|
213 | def changed(self): | |
214 | return self.state is NodeState.CHANGED |
|
214 | return self.state is NodeState.CHANGED | |
215 |
|
215 | |||
216 | @LazyProperty |
|
216 | @LazyProperty | |
217 | def not_changed(self): |
|
217 | def not_changed(self): | |
218 | return self.state is NodeState.NOT_CHANGED |
|
218 | return self.state is NodeState.NOT_CHANGED | |
219 |
|
219 | |||
220 | @LazyProperty |
|
220 | @LazyProperty | |
221 | def removed(self): |
|
221 | def removed(self): | |
222 | return self.state is NodeState.REMOVED |
|
222 | return self.state is NodeState.REMOVED | |
223 |
|
223 | |||
224 |
|
224 | |||
225 | class FileNode(Node): |
|
225 | class FileNode(Node): | |
226 | """ |
|
226 | """ | |
227 | Class representing file nodes. |
|
227 | Class representing file nodes. | |
228 |
|
228 | |||
229 | :attribute: path: path to the node, relative to repository's root |
 |
229 | :attribute: path: path to the node, relative to repository's root | |
230 | :attribute: content: if given, sets arbitrary content of the file |
 |
230 | :attribute: content: if given, sets arbitrary content of the file | |
231 | :attribute: changeset: if given, content is fetched lazily from it on first access |
 |
231 | :attribute: changeset: if given, content is fetched lazily from it on first access | |
232 | :attribute: mode: octal stat mode for a node. Default is 0100644. |
|
232 | :attribute: mode: octal stat mode for a node. Default is 0100644. | |
233 | """ |
|
233 | """ | |
234 |
|
234 | |||
235 | def __init__(self, path, content=None, changeset=None, mode=None): |
|
235 | def __init__(self, path, content=None, changeset=None, mode=None): | |
236 | """ |
|
236 | """ | |
237 | Only one of ``content`` and ``changeset`` may be given. Passing both |
|
237 | Only one of ``content`` and ``changeset`` may be given. Passing both | |
238 | would raise ``NodeError`` exception. |
|
238 | would raise ``NodeError`` exception. | |
239 |
|
239 | |||
240 | :param path: relative path to the node |
|
240 | :param path: relative path to the node | |
241 | :param content: content may be passed to constructor |
|
241 | :param content: content may be passed to constructor | |
242 | :param changeset: if given, will use it to lazily fetch content |
|
242 | :param changeset: if given, will use it to lazily fetch content | |
243 | :param mode: octal representation of ST_MODE (i.e. 0100644) |
|
243 | :param mode: octal representation of ST_MODE (i.e. 0100644) | |
244 | """ |
|
244 | """ | |
245 |
|
245 | |||
246 | if content and changeset: |
|
246 | if content and changeset: | |
247 | raise NodeError("Cannot use both content and changeset") |
|
247 | raise NodeError("Cannot use both content and changeset") | |
248 | super(FileNode, self).__init__(path, kind=NodeKind.FILE) |
|
248 | super(FileNode, self).__init__(path, kind=NodeKind.FILE) | |
249 | self.changeset = changeset |
|
249 | self.changeset = changeset | |
250 | self._content = content |
|
250 | self._content = content | |
251 | self._mode = mode or 0100644 |
|
251 | self._mode = mode or 0100644 | |
252 |
|
252 | |||
253 | @LazyProperty |
|
253 | @LazyProperty | |
254 | def mode(self): |
|
254 | def mode(self): | |
255 | """ |
|
255 | """ | |
256 | Lazily returns the mode of the FileNode. If ``changeset`` is not |
 |
256 | Lazily returns the mode of the FileNode. If ``changeset`` is not | |
257 | set, the value given at initialization or 0100644 (default) is used. |
 |
257 | set, the value given at initialization or 0100644 (default) is used. | |
|
258 | """ | |
259 | if self.changeset: |
|
259 | if self.changeset: | |
260 | mode = self.changeset.get_file_mode(self.path) |
|
260 | mode = self.changeset.get_file_mode(self.path) | |
261 | else: |
|
261 | else: | |
262 | mode = self._mode |
|
262 | mode = self._mode | |
263 | return mode |
|
263 | return mode | |
264 |
|
264 | |||
265 | @property |
|
265 | @property | |
266 | def content(self): |
|
266 | def content(self): | |
267 | """ |
|
267 | """ | |
268 | Lazily returns the content of the FileNode. If possible, the |
 |
268 | Lazily returns the content of the FileNode. If possible, the | |
269 | content is decoded from UTF-8. |
 |
269 | content is decoded from UTF-8. | |
270 | """ |
|
270 | """ | |
271 | if self.changeset: |
|
271 | if self.changeset: | |
272 | content = self.changeset.get_file_content(self.path) |
|
272 | content = self.changeset.get_file_content(self.path) | |
273 | else: |
|
273 | else: | |
274 | content = self._content |
|
274 | content = self._content | |
275 |
|
275 | |||
276 | if bool(content and '\0' in content): |
|
276 | if bool(content and '\0' in content): | |
277 | return content |
|
277 | return content | |
278 | return safe_unicode(content) |
|
278 | return safe_unicode(content) | |
279 |
|
279 | |||
280 | @LazyProperty |
|
280 | @LazyProperty | |
281 | def size(self): |
|
281 | def size(self): | |
282 | if self.changeset: |
|
282 | if self.changeset: | |
283 | return self.changeset.get_file_size(self.path) |
|
283 | return self.changeset.get_file_size(self.path) | |
284 | raise NodeError("Cannot retrieve size of the file without related " |
|
284 | raise NodeError("Cannot retrieve size of the file without related " | |
285 | "changeset attribute") |
|
285 | "changeset attribute") | |
286 |
|
286 | |||
287 | @LazyProperty |
|
287 | @LazyProperty | |
288 | def message(self): |
|
288 | def message(self): | |
289 | if self.changeset: |
|
289 | if self.changeset: | |
290 | return self.last_changeset.message |
|
290 | return self.last_changeset.message | |
291 | raise NodeError("Cannot retrieve message of the file without related " |
|
291 | raise NodeError("Cannot retrieve message of the file without related " | |
292 | "changeset attribute") |
|
292 | "changeset attribute") | |
293 |
|
293 | |||
294 | @LazyProperty |
|
294 | @LazyProperty | |
295 | def last_changeset(self): |
|
295 | def last_changeset(self): | |
296 | if self.changeset: |
|
296 | if self.changeset: | |
297 | return self.changeset.get_file_changeset(self.path) |
|
297 | return self.changeset.get_file_changeset(self.path) | |
298 | raise NodeError("Cannot retrieve last changeset of the file without " |
|
298 | raise NodeError("Cannot retrieve last changeset of the file without " | |
299 | "related changeset attribute") |
|
299 | "related changeset attribute") | |
300 |
|
300 | |||
301 | def get_mimetype(self): |
|
301 | def get_mimetype(self): | |
302 | """ |
|
302 | """ | |
303 | Mimetype is calculated based on the file's content. If ``_mimetype`` |
|
303 | Mimetype is calculated based on the file's content. If ``_mimetype`` | |
304 | attribute is available, it will be returned (backends which store |
|
304 | attribute is available, it will be returned (backends which store | |
305 | mimetypes or can easily recognize them, should set this private |
|
305 | mimetypes or can easily recognize them, should set this private | |
306 | attribute to indicate that type should *NOT* be calculated). |
|
306 | attribute to indicate that type should *NOT* be calculated). | |
307 | """ |
|
307 | """ | |
308 | if hasattr(self, '_mimetype'): |
|
308 | if hasattr(self, '_mimetype'): | |
309 | if (isinstance(self._mimetype,(tuple,list,)) and |
|
309 | if (isinstance(self._mimetype, (tuple, list,)) and | |
310 | len(self._mimetype) == 2): |
|
310 | len(self._mimetype) == 2): | |
311 | return self._mimetype |
|
311 | return self._mimetype | |
312 | else: |
|
312 | else: | |
313 | raise NodeError('given _mimetype attribute must be a 2-' |
 |
313 | raise NodeError('given _mimetype attribute must be a 2-' | |
314 | 'element list or tuple') |
 |
314 | 'element list or tuple') | |
315 |
|
315 | |||
316 | mtype,encoding = mimetypes.guess_type(self.name) |
|
316 | mtype, encoding = mimetypes.guess_type(self.name) | |
317 |
|
317 | |||
318 | if mtype is None: |
|
318 | if mtype is None: | |
319 | if self.is_binary: |
|
319 | if self.is_binary: | |
320 | mtype = 'application/octet-stream' |
|
320 | mtype = 'application/octet-stream' | |
321 | encoding = None |
|
321 | encoding = None | |
322 | else: |
|
322 | else: | |
323 | mtype = 'text/plain' |
|
323 | mtype = 'text/plain' | |
324 | encoding = None |
|
324 | encoding = None | |
325 | return mtype,encoding |
|
325 | return mtype, encoding | |
326 |
|
326 | |||
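The fallback logic in ``get_mimetype`` can be sketched with the standard ``mimetypes`` module alone; this is a simplified stand-in, not the actual method:

```python
import mimetypes

def guess_mimetype(filename, is_binary=False):
    # mimetypes.guess_type returns (None, None) for unknown names;
    # fall back the same way get_mimetype does above.
    mtype, encoding = mimetypes.guess_type(filename)
    if mtype is None:
        mtype = 'application/octet-stream' if is_binary else 'text/plain'
        encoding = None
    return mtype, encoding

print(guess_mimetype('README'))                # ('text/plain', None)
print(guess_mimetype('core', is_binary=True))  # ('application/octet-stream', None)
```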
327 | @LazyProperty |
|
327 | @LazyProperty | |
328 | def mimetype(self): |
|
328 | def mimetype(self): | |
329 | """ |
|
329 | """ | |
330 | Wrapper around full mimetype info. It returns only the type of the |
 |
330 | Wrapper around full mimetype info. It returns only the type of the | |
331 | fetched mimetype, without the encoding part. Use the get_mimetype |
 |
331 | fetched mimetype, without the encoding part. Use the get_mimetype | |
332 | method to fetch the full (type, encoding) pair. |
 |
332 | method to fetch the full (type, encoding) pair. | |
333 | """ |
|
333 | """ | |
334 | return self.get_mimetype()[0] |
|
334 | return self.get_mimetype()[0] | |
335 |
|
335 | |||
336 | @LazyProperty |
|
336 | @LazyProperty | |
337 | def mimetype_main(self): |
|
337 | def mimetype_main(self): | |
338 | return self.mimetype.split('/')[0] |
|
338 | return self.mimetype.split('/')[0] | |
339 |
|
339 | |||
340 | @LazyProperty |
|
340 | @LazyProperty | |
341 | def lexer(self): |
|
341 | def lexer(self): | |
342 | """ |
|
342 | """ | |
343 | Returns a Pygments lexer. Tries to guess the lexer from the file's |
 |
343 | Returns a Pygments lexer. Tries to guess the lexer from the file's | |
344 | content and name. |
 |
344 | content and name. | |
345 | """ |
|
345 | """ | |
346 | try: |
|
346 | try: | |
347 | lexer = lexers.guess_lexer_for_filename(self.name, self.content) |
|
347 | lexer = lexers.guess_lexer_for_filename(self.name, self.content) | |
348 | except lexers.ClassNotFound: |
|
348 | except lexers.ClassNotFound: | |
349 | lexer = lexers.TextLexer() |
|
349 | lexer = lexers.TextLexer() | |
350 | # returns first alias |
|
350 | # returns first alias | |
351 | return lexer |
|
351 | return lexer | |
352 |
|
352 | |||
353 | @LazyProperty |
|
353 | @LazyProperty | |
354 | def lexer_alias(self): |
|
354 | def lexer_alias(self): | |
355 | """ |
|
355 | """ | |
356 | Returns first alias of the lexer guessed for this file. |
|
356 | Returns first alias of the lexer guessed for this file. | |
357 | """ |
|
357 | """ | |
358 | return self.lexer.aliases[0] |
|
358 | return self.lexer.aliases[0] | |
359 |
|
359 | |||
360 | @LazyProperty |
|
360 | @LazyProperty | |
361 | def history(self): |
|
361 | def history(self): | |
362 | """ |
|
362 | """ | |
363 | Returns a list of changesets in which this file was changed |
 |
363 | Returns a list of changesets in which this file was changed | |
364 | """ |
|
364 | """ | |
365 | if self.changeset is None: |
|
365 | if self.changeset is None: | |
366 | raise NodeError('Unable to get changeset for this FileNode') |
|
366 | raise NodeError('Unable to get changeset for this FileNode') | |
367 | return self.changeset.get_file_history(self.path) |
|
367 | return self.changeset.get_file_history(self.path) | |
368 |
|
368 | |||
369 | @LazyProperty |
|
369 | @LazyProperty | |
370 | def annotate(self): |
|
370 | def annotate(self): | |
371 | """ |
|
371 | """ | |
372 | Returns a list of three-element tuples: (lineno, changeset, line) |
 |
372 | Returns a list of three-element tuples: (lineno, changeset, line) | |
373 | """ |
|
373 | """ | |
374 | if self.changeset is None: |
|
374 | if self.changeset is None: | |
375 | raise NodeError('Unable to get changeset for this FileNode') |
|
375 | raise NodeError('Unable to get changeset for this FileNode') | |
376 | return self.changeset.get_file_annotate(self.path) |
|
376 | return self.changeset.get_file_annotate(self.path) | |
377 |
|
377 | |||
378 | @LazyProperty |
|
378 | @LazyProperty | |
379 | def state(self): |
|
379 | def state(self): | |
380 | if not self.changeset: |
|
380 | if not self.changeset: | |
381 | raise NodeError("Cannot check state of the node if it's not " |
|
381 | raise NodeError("Cannot check state of the node if it's not " | |
382 | "linked with changeset") |
|
382 | "linked with changeset") | |
383 | elif self.path in (node.path for node in self.changeset.added): |
|
383 | elif self.path in (node.path for node in self.changeset.added): | |
384 | return NodeState.ADDED |
|
384 | return NodeState.ADDED | |
385 | elif self.path in (node.path for node in self.changeset.changed): |
|
385 | elif self.path in (node.path for node in self.changeset.changed): | |
386 | return NodeState.CHANGED |
|
386 | return NodeState.CHANGED | |
387 | else: |
|
387 | else: | |
388 | return NodeState.NOT_CHANGED |
|
388 | return NodeState.NOT_CHANGED | |
389 |
|
389 | |||
390 | @property |
|
390 | @property | |
391 | def is_binary(self): |
|
391 | def is_binary(self): | |
392 | """ |
|
392 | """ | |
393 | Returns True if file has binary content. |
|
393 | Returns True if file has binary content. | |
394 | """ |
|
394 | """ | |
395 | bin = '\0' in self.content |
|
395 | _bin = '\0' in self.content | |
396 | return bin |
|
396 | return _bin | |
397 |
|
397 | |||
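The null-byte heuristic used by ``is_binary`` is simple enough to test standalone:

```python
def looks_binary(content):
    # A file is treated as binary if its content contains a NUL character,
    # mirroring the '\0' in self.content check above.
    return bool(content and '\0' in content)

print(looks_binary('\x89PNG\r\n\x00rest'))  # True
print(looks_binary('print("hello")\n'))     # False
print(looks_binary(''))                     # False - empty content is not binary
```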
398 | @LazyProperty |
|
398 | @LazyProperty | |
399 | def extension(self): |
|
399 | def extension(self): | |
400 | """Returns filenode extension""" |
|
400 | """Returns filenode extension""" | |
401 | return self.name.split('.')[-1] |
|
401 | return self.name.split('.')[-1] | |
402 |
|
402 | |||
403 | def is_executable(self): |
|
403 | def is_executable(self): | |
404 | """ |
|
404 | """ | |
405 | Returns ``True`` if file has executable flag turned on. |
|
405 | Returns ``True`` if file has executable flag turned on. | |
406 | """ |
|
406 | """ | |
407 | return bool(self.mode & stat.S_IXUSR) |
|
407 | return bool(self.mode & stat.S_IXUSR) | |
408 |
|
408 | |||
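``is_executable`` only tests the owner-execute bit of the octal mode; for example (mode values written in Python 3 octal syntax):

```python
import stat

def is_executable(mode):
    # True when the owner-execute bit is set, as in FileNode.is_executable
    return bool(mode & stat.S_IXUSR)

print(is_executable(0o100755))  # True  - executable file
print(is_executable(0o100644))  # False - regular file
```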
|
409 | def __repr__(self): | |||
|
410 | return '<%s %r @ %s>' % (self.__class__.__name__, self.path, | |||
|
411 | self.changeset.short_id) | |||
|
412 | ||||
409 |
|
413 | |||
410 | class RemovedFileNode(FileNode): |
|
414 | class RemovedFileNode(FileNode): | |
411 | """ |
|
415 | """ | |
412 | Dummy FileNode class - trying to access any public attribute except path, |
|
416 | Dummy FileNode class - trying to access any public attribute except path, | |
413 | name, kind or state (or methods/attributes checking those) would raise |
 |
417 | name, kind or state (or methods/attributes checking those) would raise | |
414 | RemovedFileNodeError. |
|
418 | RemovedFileNodeError. | |
415 | """ |
|
419 | """ | |
416 | ALLOWED_ATTRIBUTES = ['name', 'path', 'state', 'is_root', 'is_file', |
|
420 | ALLOWED_ATTRIBUTES = ['name', 'path', 'state', 'is_root', 'is_file', | |
417 | 'is_dir', 'kind', 'added', 'changed', 'not_changed', 'removed'] |
|
421 | 'is_dir', 'kind', 'added', 'changed', 'not_changed', 'removed'] | |
418 |
|
422 | |||
419 | def __init__(self, path): |
|
423 | def __init__(self, path): | |
420 | """ |
|
424 | """ | |
421 | :param path: relative path to the node |
|
425 | :param path: relative path to the node | |
422 | """ |
|
426 | """ | |
423 | super(RemovedFileNode, self).__init__(path=path) |
|
427 | super(RemovedFileNode, self).__init__(path=path) | |
424 |
|
428 | |||
425 | def __getattribute__(self, attr): |
|
429 | def __getattribute__(self, attr): | |
426 | if attr.startswith('_') or attr in RemovedFileNode.ALLOWED_ATTRIBUTES: |
|
430 | if attr.startswith('_') or attr in RemovedFileNode.ALLOWED_ATTRIBUTES: | |
427 | return super(RemovedFileNode, self).__getattribute__(attr) |
|
431 | return super(RemovedFileNode, self).__getattribute__(attr) | |
428 | raise RemovedFileNodeError("Cannot access attribute %s on " |
|
432 | raise RemovedFileNodeError("Cannot access attribute %s on " | |
429 | "RemovedFileNode" % attr) |
|
433 | "RemovedFileNode" % attr) | |
430 |
|
434 | |||
431 | @LazyProperty |
|
435 | @LazyProperty | |
432 | def state(self): |
|
436 | def state(self): | |
433 | return NodeState.REMOVED |
|
437 | return NodeState.REMOVED | |
434 |
|
438 | |||
435 |
|
439 | |||
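The whitelisting trick in ``RemovedFileNode.__getattribute__`` is a general pattern: a proxy that exposes only a few safe attributes and raises on everything else. A stripped-down, hypothetical version:

```python
class RemovedStub(object):
    """Proxy that exposes only a whitelist of attributes."""
    ALLOWED = ('path', 'state')

    def __init__(self, path):
        self.path = path
        self.state = 'removed'

    def __getattribute__(self, attr):
        # Private attributes and whitelisted names pass through;
        # anything else raises, just like RemovedFileNode above.
        if attr.startswith('_') or attr in RemovedStub.ALLOWED:
            return super(RemovedStub, self).__getattribute__(attr)
        raise AttributeError("Cannot access attribute %s on removed node" % attr)

node = RemovedStub('old.txt')
print(node.path)  # old.txt
try:
    node.content
except AttributeError as exc:
    print(exc)    # Cannot access attribute content on removed node
```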
436 | class DirNode(Node): |
|
440 | class DirNode(Node): | |
437 | """ |
|
441 | """ | |
438 | DirNode stores list of files and directories within this node. |
|
442 | DirNode stores list of files and directories within this node. | |
439 | Nodes may be used standalone but within repository context they |
|
443 | Nodes may be used standalone but within repository context they | |
440 | lazily fetch data within the same repository's changeset. |
 |
444 | lazily fetch data within the same repository's changeset. | |
441 | """ |
|
445 | """ | |
442 |
|
446 | |||
443 | def __init__(self, path, nodes=(), changeset=None): |
|
447 | def __init__(self, path, nodes=(), changeset=None): | |
444 | """ |
|
448 | """ | |
445 | Only one of ``nodes`` and ``changeset`` may be given. Passing both |
|
449 | Only one of ``nodes`` and ``changeset`` may be given. Passing both | |
446 | would raise ``NodeError`` exception. |
|
450 | would raise ``NodeError`` exception. | |
447 |
|
451 | |||
448 | :param path: relative path to the node |
|
452 | :param path: relative path to the node | |
449 | :param nodes: list of nodes may be passed to constructor |
 |
453 | :param nodes: list of nodes may be passed to constructor | |
450 | :param changeset: if given, will use it to lazily fetch content |
|
454 | :param changeset: if given, will use it to lazily fetch content | |
451 | :param size: always 0 for ``DirNode`` |
|
455 | :param size: always 0 for ``DirNode`` | |
452 | """ |
|
456 | """ | |
453 | if nodes and changeset: |
|
457 | if nodes and changeset: | |
454 | raise NodeError("Cannot use both nodes and changeset") |
|
458 | raise NodeError("Cannot use both nodes and changeset") | |
455 | super(DirNode, self).__init__(path, NodeKind.DIR) |
|
459 | super(DirNode, self).__init__(path, NodeKind.DIR) | |
456 | self.changeset = changeset |
|
460 | self.changeset = changeset | |
457 | self._nodes = nodes |
|
461 | self._nodes = nodes | |
458 |
|
462 | |||
459 | @LazyProperty |
|
463 | @LazyProperty | |
460 | def content(self): |
|
464 | def content(self): | |
461 | raise NodeError("%s represents a dir and has no ``content`` attribute" |
|
465 | raise NodeError("%s represents a dir and has no ``content`` attribute" | |
462 | % self) |
|
466 | % self) | |
463 |
|
467 | |||
464 | @LazyProperty |
|
468 | @LazyProperty | |
465 | def nodes(self): |
|
469 | def nodes(self): | |
466 | if self.changeset: |
|
470 | if self.changeset: | |
467 | nodes = self.changeset.get_nodes(self.path) |
|
471 | nodes = self.changeset.get_nodes(self.path) | |
468 | else: |
|
472 | else: | |
469 | nodes = self._nodes |
|
473 | nodes = self._nodes | |
470 | self._nodes_dict = dict((node.path, node) for node in nodes) |
|
474 | self._nodes_dict = dict((node.path, node) for node in nodes) | |
471 | return sorted(nodes) |
|
475 | return sorted(nodes) | |
472 |
|
476 | |||
473 | @LazyProperty |
|
477 | @LazyProperty | |
474 | def files(self): |
|
478 | def files(self): | |
475 | return sorted((node for node in self.nodes if node.is_file())) |
|
479 | return sorted((node for node in self.nodes if node.is_file())) | |
476 |
|
480 | |||
477 | @LazyProperty |
|
481 | @LazyProperty | |
478 | def dirs(self): |
|
482 | def dirs(self): | |
479 | return sorted((node for node in self.nodes if node.is_dir())) |
|
483 | return sorted((node for node in self.nodes if node.is_dir())) | |
480 |
|
484 | |||
481 | def __iter__(self): |
|
485 | def __iter__(self): | |
482 | for node in self.nodes: |
|
486 | for node in self.nodes: | |
483 | yield node |
|
487 | yield node | |
484 |
|
488 | |||
485 | def get_node(self, path): |
|
489 | def get_node(self, path): | |
486 | """ |
|
490 | """ | |
487 | Returns node from within this particular ``DirNode``, so it is not |
 |
491 | Returns node from within this particular ``DirNode``, so it is not | |
488 | allowed to fetch, i.e. node located at 'docs/api/index.rst' from node |
 |
492 | allowed to fetch, i.e. node located at 'docs/api/index.rst' from node | |
489 | 'docs'. In order to access deeper nodes one must fetch nodes between |
|
493 | 'docs'. In order to access deeper nodes one must fetch nodes between | |
490 | them first - this would work:: |
|
494 | them first - this would work:: | |
491 |
|
495 | |||
492 | docs = root.get_node('docs') |
|
496 | docs = root.get_node('docs') | |
493 | docs.get_node('api').get_node('index.rst') |
|
497 | docs.get_node('api').get_node('index.rst') | |
494 |
|
498 | |||
495 | :param: path - relative to the current node |
|
499 | :param: path - relative to the current node | |
496 |
|
500 | |||
497 | .. note:: |
|
501 | .. note:: | |
498 | To access lazily (as in example above) the node has to be initialized |
 |
502 | To access lazily (as in example above) the node has to be initialized | |
499 | with a related changeset object - without it the node is out of |
 |
503 | with a related changeset object - without it the node is out of | |
500 | context and may know nothing about anything other than the nearest |
 |
504 | context and may know nothing about anything other than the nearest | |
501 | (located at same level) nodes. |
|
505 | (located at same level) nodes. | |
502 | """ |
|
506 | """ | |
503 | try: |
|
507 | try: | |
504 | path = path.rstrip('/') |
|
508 | path = path.rstrip('/') | |
505 | if path == '': |
|
509 | if path == '': | |
506 | raise NodeError("Cannot retrieve node without path") |
|
510 | raise NodeError("Cannot retrieve node without path") | |
507 | self.nodes # access nodes first in order to set _nodes_dict |
|
511 | self.nodes # access nodes first in order to set _nodes_dict | |
508 | paths = path.split('/') |
|
512 | paths = path.split('/') | |
509 | if len(paths) == 1: |
|
513 | if len(paths) == 1: | |
510 | if not self.is_root(): |
|
514 | if not self.is_root(): | |
511 | path = '/'.join((self.path, paths[0])) |
|
515 | path = '/'.join((self.path, paths[0])) | |
512 | else: |
|
516 | else: | |
513 | path = paths[0] |
|
517 | path = paths[0] | |
514 | return self._nodes_dict[path] |
|
518 | return self._nodes_dict[path] | |
515 | elif len(paths) > 1: |
|
519 | elif len(paths) > 1: | |
516 | if self.changeset is None: |
|
520 | if self.changeset is None: | |
517 | raise NodeError("Cannot access deeper " |
|
521 | raise NodeError("Cannot access deeper " | |
518 | "nodes without changeset") |
|
522 | "nodes without changeset") | |
519 | else: |
|
523 | else: | |
520 | path1, path2 = paths[0], '/'.join(paths[1:]) |
|
524 | path1, path2 = paths[0], '/'.join(paths[1:]) | |
521 | return self.get_node(path1).get_node(path2) |
|
525 | return self.get_node(path1).get_node(path2) | |
522 | else: |
|
526 | else: | |
523 | raise KeyError |
|
527 | raise KeyError | |
524 | except KeyError: |
|
528 | except KeyError: | |
525 | raise NodeError("Node does not exist at %s" % path) |
|
529 | raise NodeError("Node does not exist at %s" % path) | |
526 |
|
530 | |||
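The stepwise lookup documented for ``get_node`` can be mimicked with a nested mapping; this is purely illustrative and not the RhodeCode data structure:

```python
tree = {'docs': {'api': {'index.rst': 'contents of index.rst'}}}

def get_node(root, path):
    # Resolve one path segment at a time, as DirNode.get_node does
    node = root
    for part in path.rstrip('/').split('/'):
        try:
            node = node[part]
        except (KeyError, TypeError):
            raise KeyError("Node does not exist at %s" % path)
    return node

print(get_node(tree, 'docs/api/index.rst'))  # contents of index.rst
```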
527 | @LazyProperty |
|
531 | @LazyProperty | |
528 | def state(self): |
|
532 | def state(self): | |
529 | raise NodeError("Cannot access state of DirNode") |
|
533 | raise NodeError("Cannot access state of DirNode") | |
530 |
|
534 | |||
531 | @LazyProperty |
|
535 | @LazyProperty | |
532 | def size(self): |
|
536 | def size(self): | |
533 | size = 0 |
|
537 | size = 0 | |
534 | for root, dirs, files in self.changeset.walk(self.path): |
|
538 | for root, dirs, files in self.changeset.walk(self.path): | |
535 | for f in files: |
|
539 | for f in files: | |
536 | size += f.size |
|
540 | size += f.size | |
537 |
|
541 | |||
538 | return size |
|
542 | return size | |
539 |
|
543 | |||
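``DirNode.size`` sums file sizes over a recursive walk of the changeset; the same shape with ``os.walk`` on a real directory (a sketch against the filesystem, not the VCS API):

```python
import os
import tempfile

def tree_size(path):
    # Sum the sizes of all files under `path`, recursively
    total = 0
    for root, dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

with tempfile.TemporaryDirectory() as d:
    os.makedirs(os.path.join(d, 'sub'))
    with open(os.path.join(d, 'a.txt'), 'wb') as f:
        f.write(b'12345')
    with open(os.path.join(d, 'sub', 'b.txt'), 'wb') as f:
        f.write(b'123')
    print(tree_size(d))  # 8
```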
|
544 | def __repr__(self): | |||
|
545 | return '<%s %r @ %s>' % (self.__class__.__name__, self.path, | |||
|
546 | self.changeset.short_id) | |||
|
547 | ||||
540 |
|
548 | |||
541 | class RootNode(DirNode): |
|
549 | class RootNode(DirNode): | |
542 | """ |
|
550 | """ | |
543 | DirNode being the root node of the repository. |
|
551 | DirNode being the root node of the repository. | |
544 | """ |
|
552 | """ | |
545 |
|
553 | |||
546 | def __init__(self, nodes=(), changeset=None): |
|
554 | def __init__(self, nodes=(), changeset=None): | |
547 | super(RootNode, self).__init__(path='', nodes=nodes, |
|
555 | super(RootNode, self).__init__(path='', nodes=nodes, | |
548 | changeset=changeset) |
|
556 | changeset=changeset) | |
549 |
|
557 | |||
550 | def __repr__(self): |
|
558 | def __repr__(self): | |
551 | return '<%s>' % self.__class__.__name__ |
|
559 | return '<%s>' % self.__class__.__name__ |
@@ -1,460 +1,460 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # original copyright: 2007-2008 by Armin Ronacher |
|
2 | # original copyright: 2007-2008 by Armin Ronacher | |
3 | # licensed under the BSD license. |
|
3 | # licensed under the BSD license. | |
4 |
|
4 | |||
5 | import re |
|
5 | import re | |
6 | import difflib |
|
6 | import difflib | |
7 | import logging |
|
7 | import logging | |
8 |
|
8 | |||
9 | from difflib import unified_diff |
|
9 | from difflib import unified_diff | |
10 | from itertools import tee, imap |
|
10 | from itertools import tee, imap | |
11 |
|
11 | |||
12 | from mercurial.match import match |
|
12 | from mercurial.match import match | |
13 |
|
13 | |||
14 | from rhodecode.lib.vcs.exceptions import VCSError |
|
14 | from rhodecode.lib.vcs.exceptions import VCSError | |
15 | from rhodecode.lib.vcs.nodes import FileNode, NodeError |
|
15 | from rhodecode.lib.vcs.nodes import FileNode, NodeError | |
16 |
|
16 | |||
17 |
|
17 | |||
18 | def get_udiff(filenode_old, filenode_new,show_whitespace=True): |
|
18 | def get_udiff(filenode_old, filenode_new, show_whitespace=True): | |
19 | """ |
|
19 | """ | |
20 | Returns unified diff between given ``filenode_old`` and ``filenode_new``. |
|
20 | Returns unified diff between given ``filenode_old`` and ``filenode_new``. | |
21 | """ |
|
21 | """ | |
22 | try: |
|
22 | try: | |
23 |
filenode_old_date = filenode_old. |
|
23 | filenode_old_date = filenode_old.changeset.date | |
24 | except NodeError: |
|
24 | except NodeError: | |
25 | filenode_old_date = None |
|
25 | filenode_old_date = None | |
26 |
|
26 | |||
27 | try: |
|
27 | try: | |
28 |
filenode_new_date = filenode_new. |
|
28 | filenode_new_date = filenode_new.changeset.date | |
29 | except NodeError: |
|
29 | except NodeError: | |
30 | filenode_new_date = None |
|
30 | filenode_new_date = None | |
31 |
|
31 | |||
32 | for filenode in (filenode_old, filenode_new): |
|
32 | for filenode in (filenode_old, filenode_new): | |
33 | if not isinstance(filenode, FileNode): |
|
33 | if not isinstance(filenode, FileNode): | |
34 | raise VCSError("Given object should be FileNode object, not %s" |
|
34 | raise VCSError("Given object should be FileNode object, not %s" | |
35 | % filenode.__class__) |
|
35 | % filenode.__class__) | |
36 |
|
36 | |||
37 | if filenode_old_date and filenode_new_date: |
|
37 | if filenode_old_date and filenode_new_date: | |
38 | if not filenode_old_date < filenode_new_date: |
|
38 | if not filenode_old_date < filenode_new_date: | |
39 | logging.debug("Generating udiff for filenodes with not increasing " |
|
39 | logging.debug("Generating udiff for filenodes with not increasing " | |
40 | "dates") |
|
40 | "dates") | |
41 |
|
41 | |||
42 | vcs_udiff = unified_diff(filenode_old.content.splitlines(True), |
|
42 | vcs_udiff = unified_diff(filenode_old.content.splitlines(True), | |
43 | filenode_new.content.splitlines(True), |
|
43 | filenode_new.content.splitlines(True), | |
44 | filenode_old.name, |
|
44 | filenode_old.name, | |
45 | filenode_new.name, |
|
45 | filenode_new.name, | |
46 | filenode_old_date, |
|
46 | filenode_old_date, | |
47 | filenode_old_date) |
|
47 | filenode_old_date) | |
48 | return vcs_udiff |
|
48 | return vcs_udiff | |
49 |
|
49 | |||
50 |
|
50 | |||
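`get_udiff()` above is a thin wrapper over the standard library's `difflib.unified_diff`. A minimal Python 3 sketch of the underlying call (note the listing itself is Python 2 — it uses `imap` and `basestring` further down):

```python
from difflib import unified_diff

# unified_diff() yields header lines ('--- ', '+++ '), hunk headers
# ('@@ ... @@'), and content lines prefixed with '+', '-' or ' '.
old = "one\ntwo\nthree\n".splitlines(True)
new = "one\n2\nthree\n".splitlines(True)

diff = list(unified_diff(old, new, "a.txt", "b.txt"))
```

This mirrors the call in `get_udiff()`, which passes the two filenodes' content, names, and dates.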
51 | def get_gitdiff(filenode_old, filenode_new, ignore_whitespace=True): |
|
51 | def get_gitdiff(filenode_old, filenode_new, ignore_whitespace=True): | |
52 | """ |
|
52 | """ | |
53 | Returns git style diff between given ``filenode_old`` and ``filenode_new``. |
|
53 | Returns git style diff between given ``filenode_old`` and ``filenode_new``. | |
54 |
|
54 | |||
55 | :param ignore_whitespace: ignore whitespaces in diff |
|
55 | :param ignore_whitespace: ignore whitespaces in diff | |
56 | """ |
|
56 | """ | |
57 |
|
57 | |||
58 | for filenode in (filenode_old, filenode_new): |
|
58 | for filenode in (filenode_old, filenode_new): | |
59 | if not isinstance(filenode, FileNode): |
|
59 | if not isinstance(filenode, FileNode): | |
60 | raise VCSError("Given object should be FileNode object, not %s" |
|
60 | raise VCSError("Given object should be FileNode object, not %s" | |
61 | % filenode.__class__) |
|
61 | % filenode.__class__) | |
62 |
|
62 | |||
63 | old_raw_id = getattr(filenode_old.changeset, 'raw_id', '0' * 40) |
|
63 | old_raw_id = getattr(filenode_old.changeset, 'raw_id', '0' * 40) | |
64 | new_raw_id = getattr(filenode_new.changeset, 'raw_id', '0' * 40) |
|
64 | new_raw_id = getattr(filenode_new.changeset, 'raw_id', '0' * 40) | |
65 |
|
65 | |||
66 | repo = filenode_new.changeset.repository |
|
66 | repo = filenode_new.changeset.repository | |
67 | vcs_gitdiff = repo._get_diff(old_raw_id, new_raw_id, filenode_new.path, |
|
67 | vcs_gitdiff = repo._get_diff(old_raw_id, new_raw_id, filenode_new.path, | |
68 | ignore_whitespace) |
|
68 | ignore_whitespace) | |
69 |
|
69 | |||
70 | return vcs_gitdiff |
|
70 | return vcs_gitdiff | |
71 |
|
71 | |||
72 |
|
72 | |||
73 | class DiffProcessor(object): |
|
73 | class DiffProcessor(object): | |
74 | """ |
|
74 | """ | |
75 | Give it a unified diff and it returns a list of the files that were |
|
75 | Give it a unified diff and it returns a list of the files that were | |
76 | mentioned in the diff together with a dict of meta information that |
|
76 | mentioned in the diff together with a dict of meta information that | |
77 | can be used to render it in an HTML template. |
|
77 | can be used to render it in an HTML template. | |
78 | """ |
|
78 | """ | |
79 | _chunk_re = re.compile(r'@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@(.*)') |
|
79 | _chunk_re = re.compile(r'@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@(.*)') | |
80 |
|
80 | |||
81 | def __init__(self, diff, differ='diff', format='udiff'): |
|
81 | def __init__(self, diff, differ='diff', format='udiff'): | |
82 | """ |
|
82 | """ | |
83 | :param diff: a text in diff format or generator |
|
83 | :param diff: a text in diff format or generator | |
84 | :param format: format of diff passed, `udiff` or `gitdiff` |
|
84 | :param format: format of diff passed, `udiff` or `gitdiff` | |
85 | """ |
|
85 | """ | |
86 | if isinstance(diff, basestring): |
|
86 | if isinstance(diff, basestring): | |
87 | diff = [diff] |
|
87 | diff = [diff] | |
88 |
|
88 | |||
89 | self.__udiff = diff |
|
89 | self.__udiff = diff | |
90 | self.__format = format |
|
90 | self.__format = format | |
91 | self.adds = 0 |
|
91 | self.adds = 0 | |
92 | self.removes = 0 |
|
92 | self.removes = 0 | |
93 |
|
93 | |||
94 | if isinstance(self.__udiff, basestring): |
|
94 | if isinstance(self.__udiff, basestring): | |
95 | self.lines = iter(self.__udiff.splitlines(1)) |
|
95 | self.lines = iter(self.__udiff.splitlines(1)) | |
96 |
|
96 | |||
97 | elif self.__format == 'gitdiff': |
|
97 | elif self.__format == 'gitdiff': | |
98 | udiff_copy = self.copy_iterator() |
|
98 | udiff_copy = self.copy_iterator() | |
99 | self.lines = imap(self.escaper, self._parse_gitdiff(udiff_copy)) |
|
99 | self.lines = imap(self.escaper, self._parse_gitdiff(udiff_copy)) | |
100 | else: |
|
100 | else: | |
101 | udiff_copy = self.copy_iterator() |
|
101 | udiff_copy = self.copy_iterator() | |
102 | self.lines = imap(self.escaper, udiff_copy) |
|
102 | self.lines = imap(self.escaper, udiff_copy) | |
103 |
|
103 | |||
104 | # Select a differ. |
|
104 | # Select a differ. | |
105 | if differ == 'difflib': |
|
105 | if differ == 'difflib': | |
106 | self.differ = self._highlight_line_difflib |
|
106 | self.differ = self._highlight_line_difflib | |
107 | else: |
|
107 | else: | |
108 | self.differ = self._highlight_line_udiff |
|
108 | self.differ = self._highlight_line_udiff | |
109 |
|
109 | |||
110 | def escaper(self, string): |
|
110 | def escaper(self, string): | |
111 | return string.replace('<', '&lt;').replace('>', '&gt;') |
|
111 | return string.replace('<', '&lt;').replace('>', '&gt;') | |
112 |
|
112 | |||
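`escaper()` does minimal HTML escaping by hand, replacing only the two angle brackets with entities (some renderings of this listing show the entities already unescaped). A standalone sketch, with the stdlib `html.escape` as the more general alternative:

```python
import html

# Hand-rolled two-character escape, as in DiffProcessor.escaper().
def escaper(string):
    return string.replace('<', '&lt;').replace('>', '&gt;')

escaped = escaper('<del>x</del>')
# html.escape() additionally handles '&' and, by default, quotes.
full = html.escape('<del>&</del>')
```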
113 | def copy_iterator(self): |
|
113 | def copy_iterator(self): | |
114 | """ |
|
114 | """ | |
115 | Make a fresh copy of the generator; we should not iterate through |
|
115 | Make a fresh copy of the generator; we should not iterate through | |
116 | the original as it's needed for repeated operations on |
|
116 | the original as it's needed for repeated operations on | |
117 | this instance of DiffProcessor |
|
117 | this instance of DiffProcessor | |
118 | """ |
|
118 | """ | |
119 | self.__udiff, iterator_copy = tee(self.__udiff) |
|
119 | self.__udiff, iterator_copy = tee(self.__udiff) | |
120 | return iterator_copy |
|
120 | return iterator_copy | |
121 |
|
121 | |||
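`copy_iterator()` relies on `itertools.tee`: it replaces the stored generator with one tee branch and hands out the other, so the diff can be consumed more than once without exhausting the original iterator. A small sketch of that pattern:

```python
from itertools import tee

def make_source():
    # stands in for the diff generator held in self.__udiff
    yield "diff line 1\n"
    yield "diff line 2\n"

source = make_source()
# Each "copy" replaces the original with one tee branch and returns
# the other, exactly as copy_iterator() does.
source, copy1 = tee(source)
source, copy2 = tee(source)
```

Both branches can be drained independently; `tee` buffers items internally as needed.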
122 | def _extract_rev(self, line1, line2): |
|
122 | def _extract_rev(self, line1, line2): | |
123 | """ |
|
123 | """ | |
124 | Extract the filename and revision hint from a line. |
|
124 | Extract the filename and revision hint from a line. | |
125 | """ |
|
125 | """ | |
126 |
|
126 | |||
127 | try: |
|
127 | try: | |
128 | if line1.startswith('--- ') and line2.startswith('+++ '): |
|
128 | if line1.startswith('--- ') and line2.startswith('+++ '): | |
129 | l1 = line1[4:].split(None, 1) |
|
129 | l1 = line1[4:].split(None, 1) | |
130 | old_filename = l1[0].lstrip('a/') if len(l1) >= 1 else None |
|
130 | old_filename = l1[0].lstrip('a/') if len(l1) >= 1 else None | |
131 | old_rev = l1[1] if len(l1) == 2 else 'old' |
|
131 | old_rev = l1[1] if len(l1) == 2 else 'old' | |
132 |
|
132 | |||
133 | l2 = line2[4:].split(None, 1) |
|
133 | l2 = line2[4:].split(None, 1) | |
134 | new_filename = l2[0].lstrip('b/') if len(l1) >= 1 else None |
|
134 | new_filename = l2[0].lstrip('b/') if len(l1) >= 1 else None | |
135 | new_rev = l2[1] if len(l2) == 2 else 'new' |
|
135 | new_rev = l2[1] if len(l2) == 2 else 'new' | |
136 |
|
136 | |||
137 | filename = old_filename if (old_filename != |
|
137 | filename = old_filename if (old_filename != | |
138 | 'dev/null') else new_filename |
|
138 | 'dev/null') else new_filename | |
139 |
|
139 | |||
140 | return filename, new_rev, old_rev |
|
140 | return filename, new_rev, old_rev | |
141 | except (ValueError, IndexError): |
|
141 | except (ValueError, IndexError): | |
142 | pass |
|
142 | pass | |
143 |
|
143 | |||
144 | return None, None, None |
|
144 | return None, None, None | |
145 |
|
145 | |||
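The header parsing in `_extract_rev()` can be exercised in isolation. One detail worth flagging: `str.lstrip('a/')` strips any run of the characters `'a'` and `'/'`, not the literal prefix `a/`, so a filename beginning with `a` would be mangled. The sketch below removes the prefix explicitly instead:

```python
# Standalone re-creation of _extract_rev(): split the '--- ' / '+++ '
# header lines into a path and an optional revision hint.
def extract_rev(line1, line2):
    if not (line1.startswith('--- ') and line2.startswith('+++ ')):
        return None, None, None
    l1 = line1[4:].split(None, 1)
    # Slice off the literal 'a/' prefix instead of lstrip('a/').
    old_filename = l1[0][2:] if l1[0].startswith('a/') else l1[0]
    old_rev = l1[1] if len(l1) == 2 else 'old'
    l2 = line2[4:].split(None, 1)
    new_filename = l2[0][2:] if l2[0].startswith('b/') else l2[0]
    new_rev = l2[1] if len(l2) == 2 else 'new'
    filename = old_filename if old_filename != 'dev/null' else new_filename
    return filename, new_rev, old_rev
```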
146 | def _parse_gitdiff(self, diffiterator): |
|
146 | def _parse_gitdiff(self, diffiterator): | |
147 | def line_decoder(l): |
|
147 | def line_decoder(l): | |
148 | if l.startswith('+') and not l.startswith('+++'): |
|
148 | if l.startswith('+') and not l.startswith('+++'): | |
149 | self.adds += 1 |
|
149 | self.adds += 1 | |
150 | elif l.startswith('-') and not l.startswith('---'): |
|
150 | elif l.startswith('-') and not l.startswith('---'): | |
151 | self.removes += 1 |
|
151 | self.removes += 1 | |
152 | return l.decode('utf8', 'replace') |
|
152 | return l.decode('utf8', 'replace') | |
153 |
|
153 | |||
154 | output = list(diffiterator) |
|
154 | output = list(diffiterator) | |
155 | size = len(output) |
|
155 | size = len(output) | |
156 |
|
156 | |||
157 | if size == 2: |
|
157 | if size == 2: | |
158 | l = [] |
|
158 | l = [] | |
159 | l.extend([output[0]]) |
|
159 | l.extend([output[0]]) | |
160 | l.extend(output[1].splitlines(1)) |
|
160 | l.extend(output[1].splitlines(1)) | |
161 | return map(line_decoder, l) |
|
161 | return map(line_decoder, l) | |
162 | elif size == 1: |
|
162 | elif size == 1: | |
163 | return map(line_decoder, output[0].splitlines(1)) |
|
163 | return map(line_decoder, output[0].splitlines(1)) | |
164 | elif size == 0: |
|
164 | elif size == 0: | |
165 | return [] |
|
165 | return [] | |
166 |
|
166 | |||
167 | raise Exception('wrong size of diff %s' % size) |
|
167 | raise Exception('wrong size of diff %s' % size) | |
168 |
|
168 | |||
169 | def _highlight_line_difflib(self, line, next): |
|
169 | def _highlight_line_difflib(self, line, next): | |
170 | """ |
|
170 | """ | |
171 | Highlight inline changes in both lines. |
|
171 | Highlight inline changes in both lines. | |
172 | """ |
|
172 | """ | |
173 |
|
173 | |||
174 | if line['action'] == 'del': |
|
174 | if line['action'] == 'del': | |
175 | old, new = line, next |
|
175 | old, new = line, next | |
176 | else: |
|
176 | else: | |
177 | old, new = next, line |
|
177 | old, new = next, line | |
178 |
|
178 | |||
179 | oldwords = re.split(r'(\W)', old['line']) |
|
179 | oldwords = re.split(r'(\W)', old['line']) | |
180 | newwords = re.split(r'(\W)', new['line']) |
|
180 | newwords = re.split(r'(\W)', new['line']) | |
181 |
|
181 | |||
182 | sequence = difflib.SequenceMatcher(None, oldwords, newwords) |
|
182 | sequence = difflib.SequenceMatcher(None, oldwords, newwords) | |
183 |
|
183 | |||
184 | oldfragments, newfragments = [], [] |
|
184 | oldfragments, newfragments = [], [] | |
185 | for tag, i1, i2, j1, j2 in sequence.get_opcodes(): |
|
185 | for tag, i1, i2, j1, j2 in sequence.get_opcodes(): | |
186 | oldfrag = ''.join(oldwords[i1:i2]) |
|
186 | oldfrag = ''.join(oldwords[i1:i2]) | |
187 | newfrag = ''.join(newwords[j1:j2]) |
|
187 | newfrag = ''.join(newwords[j1:j2]) | |
188 | if tag != 'equal': |
|
188 | if tag != 'equal': | |
189 | if oldfrag: |
|
189 | if oldfrag: | |
190 | oldfrag = '<del>%s</del>' % oldfrag |
|
190 | oldfrag = '<del>%s</del>' % oldfrag | |
191 | if newfrag: |
|
191 | if newfrag: | |
192 | newfrag = '<ins>%s</ins>' % newfrag |
|
192 | newfrag = '<ins>%s</ins>' % newfrag | |
193 | oldfragments.append(oldfrag) |
|
193 | oldfragments.append(oldfrag) | |
194 | newfragments.append(newfrag) |
|
194 | newfragments.append(newfrag) | |
195 |
|
195 | |||
196 | old['line'] = "".join(oldfragments) |
|
196 | old['line'] = "".join(oldfragments) | |
197 | new['line'] = "".join(newfragments) |
|
197 | new['line'] = "".join(newfragments) | |
198 |
|
198 | |||
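The word-level approach of `_highlight_line_difflib()` can be shown on two plain strings: split on non-word boundaries, then wrap the non-`equal` opcode ranges in `<del>`/`<ins>` markers, exactly as the method does for the `line`/`next` dicts:

```python
import difflib
import re

old_words = re.split(r'(\W)', 'return self.adds')
new_words = re.split(r'(\W)', 'return self.removes')
sm = difflib.SequenceMatcher(None, old_words, new_words)

old_out, new_out = [], []
for tag, i1, i2, j1, j2 in sm.get_opcodes():
    oldfrag = ''.join(old_words[i1:i2])
    newfrag = ''.join(new_words[j1:j2])
    if tag != 'equal':
        # Only changed fragments get wrapped in markup.
        if oldfrag:
            oldfrag = '<del>%s</del>' % oldfrag
        if newfrag:
            newfrag = '<ins>%s</ins>' % newfrag
    old_out.append(oldfrag)
    new_out.append(newfrag)
```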
199 | def _highlight_line_udiff(self, line, next): |
|
199 | def _highlight_line_udiff(self, line, next): | |
200 | """ |
|
200 | """ | |
201 | Highlight inline changes in both lines. |
|
201 | Highlight inline changes in both lines. | |
202 | """ |
|
202 | """ | |
203 | start = 0 |
|
203 | start = 0 | |
204 | limit = min(len(line['line']), len(next['line'])) |
|
204 | limit = min(len(line['line']), len(next['line'])) | |
205 | while start < limit and line['line'][start] == next['line'][start]: |
|
205 | while start < limit and line['line'][start] == next['line'][start]: | |
206 | start += 1 |
|
206 | start += 1 | |
207 | end = -1 |
|
207 | end = -1 | |
208 | limit -= start |
|
208 | limit -= start | |
209 | while -end <= limit and line['line'][end] == next['line'][end]: |
|
209 | while -end <= limit and line['line'][end] == next['line'][end]: | |
210 | end -= 1 |
|
210 | end -= 1 | |
211 | end += 1 |
|
211 | end += 1 | |
212 | if start or end: |
|
212 | if start or end: | |
213 | def do(l): |
|
213 | def do(l): | |
214 | last = end + len(l['line']) |
|
214 | last = end + len(l['line']) | |
215 | if l['action'] == 'add': |
|
215 | if l['action'] == 'add': | |
216 | tag = 'ins' |
|
216 | tag = 'ins' | |
217 | else: |
|
217 | else: | |
218 | tag = 'del' |
|
218 | tag = 'del' | |
219 | l['line'] = '%s<%s>%s</%s>%s' % ( |
|
219 | l['line'] = '%s<%s>%s</%s>%s' % ( | |
220 | l['line'][:start], |
|
220 | l['line'][:start], | |
221 | tag, |
|
221 | tag, | |
222 | l['line'][start:last], |
|
222 | l['line'][start:last], | |
223 | tag, |
|
223 | tag, | |
224 | l['line'][last:] |
|
224 | l['line'][last:] | |
225 | ) |
|
225 | ) | |
226 | do(line) |
|
226 | do(line) | |
227 | do(next) |
|
227 | do(next) | |
228 |
|
228 | |||
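By contrast, `_highlight_line_udiff()` avoids `difflib` entirely: it scans for the common prefix and common suffix of the two lines and tags only the differing middle. The scan in isolation (note `end` stays zero or negative, as in the method above):

```python
def common_affixes(a, b):
    # Length of the shared prefix.
    start = 0
    limit = min(len(a), len(b))
    while start < limit and a[start] == b[start]:
        start += 1
    # Negative offset just past the shared suffix.
    end = -1
    limit -= start
    while -end <= limit and a[end] == b[end]:
        end -= 1
    end += 1
    return start, end

a, b = 'self.adds += 1', 'self.removes += 1'
start, end = common_affixes(a, b)
```

The slice `a[start:len(a) + end]` is then the part wrapped in `<ins>` or `<del>`.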
229 | def _parse_udiff(self): |
|
229 | def _parse_udiff(self): | |
230 | """ |
|
230 | """ | |
231 | Parse the diff and return data for the template. |
|
231 | Parse the diff and return data for the template. | |
232 | """ |
|
232 | """ | |
233 | lineiter = self.lines |
|
233 | lineiter = self.lines | |
234 | files = [] |
|
234 | files = [] | |
235 | try: |
|
235 | try: | |
236 | line = lineiter.next() |
|
236 | line = lineiter.next() | |
237 | # skip first context |
|
237 | # skip first context | |
238 | skipfirst = True |
|
238 | skipfirst = True | |
239 | while 1: |
|
239 | while 1: | |
240 | # continue until we found the old file |
|
240 | # continue until we found the old file | |
241 | if not line.startswith('--- '): |
|
241 | if not line.startswith('--- '): | |
242 | line = lineiter.next() |
|
242 | line = lineiter.next() | |
243 | continue |
|
243 | continue | |
244 |
|
244 | |||
245 | chunks = [] |
|
245 | chunks = [] | |
246 | filename, old_rev, new_rev = \ |
|
246 | filename, old_rev, new_rev = \ | |
247 | self._extract_rev(line, lineiter.next()) |
|
247 | self._extract_rev(line, lineiter.next()) | |
248 | files.append({ |
|
248 | files.append({ | |
249 | 'filename': filename, |
|
249 | 'filename': filename, | |
250 | 'old_revision': old_rev, |
|
250 | 'old_revision': old_rev, | |
251 | 'new_revision': new_rev, |
|
251 | 'new_revision': new_rev, | |
252 | 'chunks': chunks |
|
252 | 'chunks': chunks | |
253 | }) |
|
253 | }) | |
254 |
|
254 | |||
255 | line = lineiter.next() |
|
255 | line = lineiter.next() | |
256 | while line: |
|
256 | while line: | |
257 | match = self._chunk_re.match(line) |
|
257 | match = self._chunk_re.match(line) | |
258 | if not match: |
|
258 | if not match: | |
259 | break |
|
259 | break | |
260 |
|
260 | |||
261 | lines = [] |
|
261 | lines = [] | |
262 | chunks.append(lines) |
|
262 | chunks.append(lines) | |
263 |
|
263 | |||
264 | old_line, old_end, new_line, new_end = \ |
|
264 | old_line, old_end, new_line, new_end = \ | |
265 | [int(x or 1) for x in match.groups()[:-1]] |
|
265 | [int(x or 1) for x in match.groups()[:-1]] | |
266 | old_line -= 1 |
|
266 | old_line -= 1 | |
267 | new_line -= 1 |
|
267 | new_line -= 1 | |
268 | context = len(match.groups()) == 5 |
|
268 | context = len(match.groups()) == 5 | |
269 | old_end += old_line |
|
269 | old_end += old_line | |
270 | new_end += new_line |
|
270 | new_end += new_line | |
271 |
|
271 | |||
272 | if context: |
|
272 | if context: | |
273 | if not skipfirst: |
|
273 | if not skipfirst: | |
274 | lines.append({ |
|
274 | lines.append({ | |
275 | 'old_lineno': '...', |
|
275 | 'old_lineno': '...', | |
276 | 'new_lineno': '...', |
|
276 | 'new_lineno': '...', | |
277 | 'action': 'context', |
|
277 | 'action': 'context', | |
278 | 'line': line, |
|
278 | 'line': line, | |
279 | }) |
|
279 | }) | |
280 | else: |
|
280 | else: | |
281 | skipfirst = False |
|
281 | skipfirst = False | |
282 |
|
282 | |||
283 | line = lineiter.next() |
|
283 | line = lineiter.next() | |
284 | while old_line < old_end or new_line < new_end: |
|
284 | while old_line < old_end or new_line < new_end: | |
285 | if line: |
|
285 | if line: | |
286 | command, line = line[0], line[1:] |
|
286 | command, line = line[0], line[1:] | |
287 | else: |
|
287 | else: | |
288 | command = ' ' |
|
288 | command = ' ' | |
289 | affects_old = affects_new = False |
|
289 | affects_old = affects_new = False | |
290 |
|
290 | |||
291 | # ignore those if we don't expect them |
|
291 | # ignore those if we don't expect them | |
292 | if command in '#@': |
|
292 | if command in '#@': | |
293 | continue |
|
293 | continue | |
294 | elif command == '+': |
|
294 | elif command == '+': | |
295 | affects_new = True |
|
295 | affects_new = True | |
296 | action = 'add' |
|
296 | action = 'add' | |
297 | elif command == '-': |
|
297 | elif command == '-': | |
298 | affects_old = True |
|
298 | affects_old = True | |
299 | action = 'del' |
|
299 | action = 'del' | |
300 | else: |
|
300 | else: | |
301 | affects_old = affects_new = True |
|
301 | affects_old = affects_new = True | |
302 | action = 'unmod' |
|
302 | action = 'unmod' | |
303 |
|
303 | |||
304 | old_line += affects_old |
|
304 | old_line += affects_old | |
305 | new_line += affects_new |
|
305 | new_line += affects_new | |
306 | lines.append({ |
|
306 | lines.append({ | |
307 | 'old_lineno': affects_old and old_line or '', |
|
307 | 'old_lineno': affects_old and old_line or '', | |
308 | 'new_lineno': affects_new and new_line or '', |
|
308 | 'new_lineno': affects_new and new_line or '', | |
309 | 'action': action, |
|
309 | 'action': action, | |
310 | 'line': line |
|
310 | 'line': line | |
311 | }) |
|
311 | }) | |
312 | line = lineiter.next() |
|
312 | line = lineiter.next() | |
313 |
|
313 | |||
314 | except StopIteration: |
|
314 | except StopIteration: | |
315 | pass |
|
315 | pass | |
316 |
|
316 | |||
317 | # highlight inline changes |
|
317 | # highlight inline changes | |
318 | for file in files: |
|
318 | for file in files: | |
319 | for chunk in chunks: |
|
319 | for chunk in chunks: | |
320 | lineiter = iter(chunk) |
|
320 | lineiter = iter(chunk) | |
321 | #first = True |
|
321 | #first = True | |
322 | try: |
|
322 | try: | |
323 | while 1: |
|
323 | while 1: | |
324 | line = lineiter.next() |
|
324 | line = lineiter.next() | |
325 | if line['action'] != 'unmod': |
|
325 | if line['action'] != 'unmod': | |
326 | nextline = lineiter.next() |
|
326 | nextline = lineiter.next() | |
327 | if nextline['action'] == 'unmod' or \ |
|
327 | if nextline['action'] == 'unmod' or \ | |
328 | nextline['action'] == line['action']: |
|
328 | nextline['action'] == line['action']: | |
329 | continue |
|
329 | continue | |
330 | self.differ(line, nextline) |
|
330 | self.differ(line, nextline) | |
331 | except StopIteration: |
|
331 | except StopIteration: | |
332 | pass |
|
332 | pass | |
333 |
|
333 | |||
334 | return files |
|
334 | return files | |
335 |
|
335 | |||
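The hunk parsing inside `_parse_udiff()` is driven by `_chunk_re` from the top of the class. Applying that exact pattern to a typical hunk header shows what the four numeric groups and the trailing context group capture:

```python
import re

# The pattern from DiffProcessor._chunk_re: old start, optional old
# length, new start, optional new length, then trailing context.
chunk_re = re.compile(r'@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@(.*)')

m = chunk_re.match('@@ -12,8 +12,9 @@ def get_udiff(...):')
old_line, old_len, new_line, new_len, context = m.groups()
```

When a length is omitted (e.g. `@@ -1 +1 @@`), the corresponding group is `None`, which is why the parser falls back with `int(x or 1)`.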
336 | def prepare(self): |
|
336 | def prepare(self): | |
337 | """ |
|
337 | """ | |
338 | Prepare the passed udiff for HTML rendering. It'll return a list |
|
338 | Prepare the passed udiff for HTML rendering. It'll return a list | |
339 | of dicts |
|
339 | of dicts | |
340 | """ |
|
340 | """ | |
341 | return self._parse_udiff() |
|
341 | return self._parse_udiff() | |
342 |
|
342 | |||
343 | def _safe_id(self, idstring): |
|
343 | def _safe_id(self, idstring): | |
344 | """Make a string safe for including in an id attribute. |
|
344 | """Make a string safe for including in an id attribute. | |
345 |
|
345 | |||
346 | The HTML spec says that id attributes 'must begin with |
|
346 | The HTML spec says that id attributes 'must begin with | |
347 | a letter ([A-Za-z]) and may be followed by any number |
|
347 | a letter ([A-Za-z]) and may be followed by any number | |
348 | of letters, digits ([0-9]), hyphens ("-"), underscores |
|
348 | of letters, digits ([0-9]), hyphens ("-"), underscores | |
349 | ("_"), colons (":"), and periods (".")'. These regexps |
|
349 | ("_"), colons (":"), and periods (".")'. These regexps | |
350 | are slightly over-zealous, in that they remove colons |
|
350 | are slightly over-zealous, in that they remove colons | |
351 | and periods unnecessarily. |
|
351 | and periods unnecessarily. | |
352 |
|
352 | |||
353 | Whitespace is transformed into underscores, and then |
|
353 | Whitespace is transformed into underscores, and then | |
354 | anything which is not a hyphen or a character that |
|
354 | anything which is not a hyphen or a character that | |
355 | matches \w (alphanumerics and underscore) is removed. |
|
355 | matches \w (alphanumerics and underscore) is removed. | |
356 |
|
356 | |||
357 | """ |
|
357 | """ | |
358 | # Transform all whitespace to underscore |
|
358 | # Transform all whitespace to underscore | |
359 | idstring = re.sub(r'\s', "_", '%s' % idstring) |
|
359 | idstring = re.sub(r'\s', "_", '%s' % idstring) | |
360 | # Remove everything that is not a hyphen or a member of \w |
|
360 | # Remove everything that is not a hyphen or a member of \w | |
361 | idstring = re.sub(r'(?!-)\W', "", idstring).lower() |
|
361 | idstring = re.sub(r'(?!-)\W', "", idstring).lower() | |
362 | return idstring |
|
362 | return idstring | |
363 |
|
363 | |||
364 | def raw_diff(self): |
|
364 | def raw_diff(self): | |
365 | """ |
|
365 | """ | |
366 | Returns raw string as udiff |
|
366 | Returns raw string as udiff | |
367 | """ |
|
367 | """ | |
368 | udiff_copy = self.copy_iterator() |
|
368 | udiff_copy = self.copy_iterator() | |
369 | if self.__format == 'gitdiff': |
|
369 | if self.__format == 'gitdiff': | |
370 | udiff_copy = self._parse_gitdiff(udiff_copy) |
|
370 | udiff_copy = self._parse_gitdiff(udiff_copy) | |
371 | return u''.join(udiff_copy) |
|
371 | return u''.join(udiff_copy) | |
372 |
|
372 | |||
373 | def as_html(self, table_class='code-difftable', line_class='line', |
|
373 | def as_html(self, table_class='code-difftable', line_class='line', | |
374 | new_lineno_class='lineno old', old_lineno_class='lineno new', |
|
374 | new_lineno_class='lineno old', old_lineno_class='lineno new', | |
375 | code_class='code'): |
|
375 | code_class='code'): | |
376 | """ |
|
376 | """ | |
377 | Return udiff as html table with customized css classes |
|
377 | Return udiff as html table with customized css classes | |
378 | """ |
|
378 | """ | |
379 | def _link_to_if(condition, label, url): |
|
379 | def _link_to_if(condition, label, url): | |
380 | """ |
|
380 | """ | |
381 | Generates a link if condition is met or just the label if not. |
|
381 | Generates a link if condition is met or just the label if not. | |
382 | """ |
|
382 | """ | |
383 |
|
383 | |||
384 | if condition: |
|
384 | if condition: | |
385 | return '''<a href="%(url)s">%(label)s</a>''' % {'url': url, |
|
385 | return '''<a href="%(url)s">%(label)s</a>''' % {'url': url, | |
386 | 'label': label} |
|
386 | 'label': label} | |
387 | else: |
|
387 | else: | |
388 | return label |
|
388 | return label | |
389 | diff_lines = self.prepare() |
|
389 | diff_lines = self.prepare() | |
390 | _html_empty = True |
|
390 | _html_empty = True | |
391 | _html = [] |
|
391 | _html = [] | |
392 | _html.append('''<table class="%(table_class)s">\n''' \ |
|
392 | _html.append('''<table class="%(table_class)s">\n''' \ | |
393 | % {'table_class': table_class}) |
|
393 | % {'table_class': table_class}) | |
394 | for diff in diff_lines: |
|
394 | for diff in diff_lines: | |
395 | for line in diff['chunks']: |
|
395 | for line in diff['chunks']: | |
396 | _html_empty = False |
|
396 | _html_empty = False | |
397 | for change in line: |
|
397 | for change in line: | |
398 | _html.append('''<tr class="%(line_class)s %(action)s">\n''' \ |
|
398 | _html.append('''<tr class="%(line_class)s %(action)s">\n''' \ | |
399 | % {'line_class': line_class, |
|
399 | % {'line_class': line_class, | |
400 | 'action': change['action']}) |
|
400 | 'action': change['action']}) | |
401 | anchor_old_id = '' |
|
401 | anchor_old_id = '' | |
402 | anchor_new_id = '' |
|
402 | anchor_new_id = '' | |
403 | anchor_old = "%(filename)s_o%(oldline_no)s" % \ |
|
403 | anchor_old = "%(filename)s_o%(oldline_no)s" % \ | |
404 | {'filename': self._safe_id(diff['filename']), |
|
404 | {'filename': self._safe_id(diff['filename']), | |
405 | 'oldline_no': change['old_lineno']} |
|
405 | 'oldline_no': change['old_lineno']} | |
406 | anchor_new = "%(filename)s_n%(oldline_no)s" % \ |
|
406 | anchor_new = "%(filename)s_n%(oldline_no)s" % \ | |
407 | {'filename': self._safe_id(diff['filename']), |
|
407 | {'filename': self._safe_id(diff['filename']), | |
408 | 'oldline_no': change['new_lineno']} |
|
408 | 'oldline_no': change['new_lineno']} | |
409 | cond_old = change['old_lineno'] != '...' and \ |
|
409 | cond_old = change['old_lineno'] != '...' and \ | |
410 | change['old_lineno'] |
|
410 | change['old_lineno'] | |
411 | cond_new = change['new_lineno'] != '...' and \ |
|
411 | cond_new = change['new_lineno'] != '...' and \ | |
412 | change['new_lineno'] |
|
412 | change['new_lineno'] | |
413 | if cond_old: |
|
413 | if cond_old: | |
414 | anchor_old_id = 'id="%s"' % anchor_old |
|
414 | anchor_old_id = 'id="%s"' % anchor_old | |
415 | if cond_new: |
|
415 | if cond_new: | |
416 | anchor_new_id = 'id="%s"' % anchor_new |
|
416 | anchor_new_id = 'id="%s"' % anchor_new | |
417 | ########################################################### |
|
417 | ########################################################### | |
418 | # OLD LINE NUMBER |
|
418 | # OLD LINE NUMBER | |
419 | ########################################################### |
|
419 | ########################################################### | |
420 | _html.append('''\t<td %(a_id)s class="%(old_lineno_cls)s">''' \ |
|
420 | _html.append('''\t<td %(a_id)s class="%(old_lineno_cls)s">''' \ | |
421 | % {'a_id': anchor_old_id, |
|
421 | % {'a_id': anchor_old_id, | |
422 | 'old_lineno_cls': old_lineno_class}) |
|
422 | 'old_lineno_cls': old_lineno_class}) | |
423 |
|
423 | |||
424 | _html.append('''<pre>%(link)s</pre>''' \ |
|
424 | _html.append('''<pre>%(link)s</pre>''' \ | |
425 | % {'link': |
|
425 | % {'link': | |
426 | _link_to_if(cond_old, change['old_lineno'], '#%s' \ |
|
426 | _link_to_if(cond_old, change['old_lineno'], '#%s' \ | |
427 | % anchor_old)}) |
|
427 | % anchor_old)}) | |
428 | _html.append('''</td>\n''') |
|
428 | _html.append('''</td>\n''') | |
429 | ########################################################### |
|
429 | ########################################################### | |
430 | # NEW LINE NUMBER |
|
430 | # NEW LINE NUMBER | |
431 | ########################################################### |
|
431 | ########################################################### | |
432 |
|
432 | |||
433 | _html.append('''\t<td %(a_id)s class="%(new_lineno_cls)s">''' \ |
|
433 | _html.append('''\t<td %(a_id)s class="%(new_lineno_cls)s">''' \ | |
434 | % {'a_id': anchor_new_id, |
|
434 | % {'a_id': anchor_new_id, | |
435 | 'new_lineno_cls': new_lineno_class}) |
|
435 | 'new_lineno_cls': new_lineno_class}) | |
436 |
|
436 | |||
437 | _html.append('''<pre>%(link)s</pre>''' \ |
|
437 | _html.append('''<pre>%(link)s</pre>''' \ | |
438 | % {'link': |
|
438 | % {'link': | |
439 | _link_to_if(cond_new, change['new_lineno'], '#%s' \ |
|
439 | _link_to_if(cond_new, change['new_lineno'], '#%s' \ | |
440 | % anchor_new)}) |
|
440 | % anchor_new)}) | |
441 | _html.append('''</td>\n''') |
|
441 | _html.append('''</td>\n''') | |
442 | ########################################################### |
|
442 | ########################################################### | |
443 | # CODE |
|
443 | # CODE | |
444 | ########################################################### |
|
444 | ########################################################### | |
445 | _html.append('''\t<td class="%(code_class)s">''' \ |
|
445 | _html.append('''\t<td class="%(code_class)s">''' \ | |
446 | % {'code_class': code_class}) |
|
446 | % {'code_class': code_class}) | |
447 | _html.append('''\n\t\t<pre>%(code)s</pre>\n''' \ |
|
447 | _html.append('''\n\t\t<pre>%(code)s</pre>\n''' \ | |
448 | % {'code': change['line']}) |
|
448 | % {'code': change['line']}) | |
449 | _html.append('''\t</td>''') |
|
449 | _html.append('''\t</td>''') | |
450 | _html.append('''\n</tr>\n''') |
|
450 | _html.append('''\n</tr>\n''') | |
451 | _html.append('''</table>''') |
|
451 | _html.append('''</table>''') | |
452 | if _html_empty: |
|
452 | if _html_empty: | |
453 | return None |
|
453 | return None | |
454 | return ''.join(_html) |
|
454 | return ''.join(_html) | |
455 |
|
455 | |||
456 | def stat(self): |
|
456 | def stat(self): | |
457 | """ |
|
457 | """ | |
458 | Returns tuple of added and removed lines for this instance |
|
458 | Returns tuple of added and removed lines for this instance | |
459 | """ |
|
459 | """ | |
460 | return self.adds, self.removes |
|
460 | return self.adds, self.removes |
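The counters returned by `stat()` are accumulated in `_parse_gitdiff()`'s `line_decoder`: `+`/`-` content lines count, but the `+++`/`---` file headers do not. The same logic as a small standalone function:

```python
def diff_stat(lines):
    adds = removes = 0
    for l in lines:
        if l.startswith('+') and not l.startswith('+++'):
            adds += 1
        elif l.startswith('-') and not l.startswith('---'):
            removes += 1
    return adds, removes

stats = diff_stat([
    '--- a/f.py', '+++ b/f.py', '@@ -1,2 +1,2 @@',
    '-old line', '+new line', '+extra line', ' context',
])
```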
@@ -1,143 +1,148 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | """ |
|
2 | """ | |
3 | rhodecode.model.comment |
|
3 | rhodecode.model.comment | |
4 | ~~~~~~~~~~~~~~~~~~~~~~~ |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~ | |
5 |
|
5 | |||
6 | comments model for RhodeCode |
|
6 | comments model for RhodeCode | |
7 |
|
7 | |||
8 | :created_on: Nov 11, 2011 |
|
8 | :created_on: Nov 11, 2011 | |
9 | :author: marcink |
|
9 | :author: marcink | |
10 | :copyright: (C) 2011-2012 Marcin Kuzminski <marcin@python-works.com> |
|
10 | :copyright: (C) 2011-2012 Marcin Kuzminski <marcin@python-works.com> | |
11 | :license: GPLv3, see COPYING for more details. |
|
11 | :license: GPLv3, see COPYING for more details. | |
12 | """ |
|
12 | """ | |
13 | # This program is free software: you can redistribute it and/or modify |
|
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
 import logging
 import traceback
 
 from pylons.i18n.translation import _
 from sqlalchemy.util.compat import defaultdict
 
 from rhodecode.lib import extract_mentioned_users
 from rhodecode.lib import helpers as h
 from rhodecode.model import BaseModel
 from rhodecode.model.db import ChangesetComment, User, Repository, Notification
 from rhodecode.model.notification import NotificationModel
 
 log = logging.getLogger(__name__)
 
 
 class ChangesetCommentsModel(BaseModel):
 
     def __get_changeset_comment(self, changeset_comment):
         return self._get_instance(ChangesetComment, changeset_comment)
 
     def _extract_mentions(self, s):
         user_objects = []
         for username in extract_mentioned_users(s):
             user_obj = User.get_by_username(username, case_insensitive=True)
             if user_obj:
                 user_objects.append(user_obj)
         return user_objects
 
     def create(self, text, repo_id, user_id, revision, f_path=None,
                line_no=None):
         """
         Creates new comment for changeset
 
         :param text:
         :param repo_id:
         :param user_id:
         :param revision:
         :param f_path:
         :param line_no:
         """
         if text:
             repo = Repository.get(repo_id)
             cs = repo.scm_instance.get_changeset(revision)
             desc = cs.message
-            author = cs.author_email
+            author_email = cs.author_email
             comment = ChangesetComment()
             comment.repo = repo
             comment.user_id = user_id
             comment.revision = revision
             comment.text = text
             comment.f_path = f_path
             comment.line_no = line_no
 
             self.sa.add(comment)
             self.sa.flush()
 
             # make notification
             line = ''
             if line_no:
                 line = _('on line %s') % line_no
             subj = h.link_to('Re commit: %(commit_desc)s %(line)s' % \
                       {'commit_desc': desc, 'line': line},
                       h.url('changeset_home', repo_name=repo.repo_name,
                             revision=revision,
                             anchor='comment-%s' % comment.comment_id,
                             qualified=True,
                       )
             )
             body = text
+
+            # get the current participants of this changeset
             recipients = ChangesetComment.get_users(revision=revision)
-            # add changeset author
-            recipients += [User.get_by_email(author)]
 
-            NotificationModel().create(created_by=user_id, subject=subj,
-                                       body=body, recipients=recipients,
-                                       type_=Notification.TYPE_CHANGESET_COMMENT)
+            # add changeset author if it's in rhodecode system
+            recipients += [User.get_by_email(author_email)]
+
+            NotificationModel().create(
+                created_by=user_id, subject=subj, body=body,
+                recipients=recipients, type_=Notification.TYPE_CHANGESET_COMMENT
+            )
 
             mention_recipients = set(self._extract_mentions(body))\
                                     .difference(recipients)
             if mention_recipients:
                 subj = _('[Mention]') + ' ' + subj
                 NotificationModel().create(
-                    body=body,
-                    recipients=mention_recipients,
-                    type_=Notification.TYPE_CHANGESET_COMMENT)
+                    created_by=user_id, subject=subj, body=body,
+                    recipients=mention_recipients,
+                    type_=Notification.TYPE_CHANGESET_COMMENT
+                )
 
         return comment
 
     def delete(self, comment):
         """
         Deletes given comment
 
         :param comment_id:
         """
         comment = self.__get_changeset_comment(comment)
         self.sa.delete(comment)
 
         return comment
 
     def get_comments(self, repo_id, revision):
         return ChangesetComment.query()\
             .filter(ChangesetComment.repo_id == repo_id)\
             .filter(ChangesetComment.revision == revision)\
             .filter(ChangesetComment.line_no == None)\
             .filter(ChangesetComment.f_path == None).all()
 
     def get_inline_comments(self, repo_id, revision):
         comments = self.sa.query(ChangesetComment)\
             .filter(ChangesetComment.repo_id == repo_id)\
             .filter(ChangesetComment.revision == revision)\
             .filter(ChangesetComment.line_no != None)\
             .filter(ChangesetComment.f_path != None).all()
 
         paths = defaultdict(lambda: defaultdict(list))
 
         for co in comments:
             paths[co.f_path][co.line_no].append(co)
         return paths.items()
@@ -1,1216 +1,1218 @@
 # -*- coding: utf-8 -*-
 """
     rhodecode.model.db
     ~~~~~~~~~~~~~~~~~~
 
     Database Models for RhodeCode
 
     :created_on: Apr 08, 2010
     :author: marcink
     :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com>
     :license: GPLv3, see COPYING for more details.
 """
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
 import os
 import logging
 import datetime
 import traceback
 from collections import defaultdict
 
 from sqlalchemy import *
 from sqlalchemy.ext.hybrid import hybrid_property
 from sqlalchemy.orm import relationship, joinedload, class_mapper, validates
 from beaker.cache import cache_region, region_invalidate
 
 from rhodecode.lib.vcs import get_backend
 from rhodecode.lib.vcs.utils.helpers import get_scm
 from rhodecode.lib.vcs.exceptions import VCSError
 from rhodecode.lib.vcs.utils.lazy import LazyProperty
 
 from rhodecode.lib import str2bool, safe_str, get_changeset_safe, safe_unicode
 from rhodecode.lib.compat import json
 from rhodecode.lib.caching_query import FromCache
 
 from rhodecode.model.meta import Base, Session
 import hashlib
 
 
 log = logging.getLogger(__name__)
 
 #==============================================================================
 # BASE CLASSES
 #==============================================================================
 
 _hash_key = lambda k: hashlib.md5(safe_str(k)).hexdigest()
 
 
 class ModelSerializer(json.JSONEncoder):
     """
     Simple Serializer for JSON,
 
     usage::
 
         to make object customized for serialization implement a __json__
         method that will return a dict for serialization into json
 
     example::
 
         class Task(object):
 
             def __init__(self, name, value):
                 self.name = name
                 self.value = value
 
             def __json__(self):
                 return dict(name=self.name,
                             value=self.value)
 
     """
 
     def default(self, obj):
 
         if hasattr(obj, '__json__'):
             return obj.__json__()
         else:
             return json.JSONEncoder.default(self, obj)
 
 
 class BaseModel(object):
     """
     Base Model for all classess
     """
 
     @classmethod
     def _get_keys(cls):
         """return column names for this model """
         return class_mapper(cls).c.keys()
 
     def get_dict(self):
         """
         return dict with keys and values corresponding
         to this model data """
 
         d = {}
         for k in self._get_keys():
             d[k] = getattr(self, k)
 
         # also use __json__() if present to get additional fields
         for k, val in getattr(self, '__json__', lambda: {})().iteritems():
             d[k] = val
         return d
 
     def get_appstruct(self):
         """return list with keys and values tupples corresponding
         to this model data """
 
         l = []
         for k in self._get_keys():
             l.append((k, getattr(self, k),))
         return l
 
     def populate_obj(self, populate_dict):
         """populate model with data from given populate_dict"""
 
         for k in self._get_keys():
             if k in populate_dict:
                 setattr(self, k, populate_dict[k])
 
     @classmethod
     def query(cls):
         return Session.query(cls)
 
     @classmethod
     def get(cls, id_):
         if id_:
             return cls.query().get(id_)
 
     @classmethod
     def getAll(cls):
         return cls.query().all()
 
     @classmethod
     def delete(cls, id_):
         obj = cls.query().get(id_)
         Session.delete(obj)
 
 
 class RhodeCodeSetting(Base, BaseModel):
     __tablename__ = 'rhodecode_settings'
     __table_args__ = (
         UniqueConstraint('app_settings_name'),
         {'extend_existing': True}
     )
     app_settings_id = Column("app_settings_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
     app_settings_name = Column("app_settings_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     _app_settings_value = Column("app_settings_value", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
 
     def __init__(self, k='', v=''):
         self.app_settings_name = k
         self.app_settings_value = v
 
     @validates('_app_settings_value')
     def validate_settings_value(self, key, val):
         assert type(val) == unicode
         return val
 
     @hybrid_property
     def app_settings_value(self):
         v = self._app_settings_value
         if self.app_settings_name == 'ldap_active':
             v = str2bool(v)
         return v
 
     @app_settings_value.setter
     def app_settings_value(self, val):
         """
         Setter that will always make sure we use unicode in app_settings_value
 
         :param val:
         """
         self._app_settings_value = safe_unicode(val)
 
     def __repr__(self):
         return "<%s('%s:%s')>" % (
             self.__class__.__name__,
             self.app_settings_name, self.app_settings_value
         )
 
     @classmethod
     def get_by_name(cls, ldap_key):
         return cls.query()\
             .filter(cls.app_settings_name == ldap_key).scalar()
 
     @classmethod
     def get_app_settings(cls, cache=False):
 
         ret = cls.query()
 
         if cache:
             ret = ret.options(FromCache("sql_cache_short", "get_hg_settings"))
 
         if not ret:
             raise Exception('Could not get application settings !')
         settings = {}
         for each in ret:
             settings['rhodecode_' + each.app_settings_name] = \
                 each.app_settings_value
 
         return settings
 
     @classmethod
     def get_ldap_settings(cls, cache=False):
         ret = cls.query()\
             .filter(cls.app_settings_name.startswith('ldap_')).all()
         fd = {}
         for row in ret:
             fd.update({row.app_settings_name:row.app_settings_value})
 
         return fd
 
 
 class RhodeCodeUi(Base, BaseModel):
     __tablename__ = 'rhodecode_ui'
     __table_args__ = (
         UniqueConstraint('ui_key'),
         {'extend_existing': True}
     )
 
     HOOK_UPDATE = 'changegroup.update'
     HOOK_REPO_SIZE = 'changegroup.repo_size'
     HOOK_PUSH = 'pretxnchangegroup.push_logger'
     HOOK_PULL = 'preoutgoing.pull_logger'
 
     ui_id = Column("ui_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
     ui_section = Column("ui_section", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     ui_key = Column("ui_key", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     ui_value = Column("ui_value", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     ui_active = Column("ui_active", Boolean(), nullable=True, unique=None, default=True)
 
     @classmethod
     def get_by_key(cls, key):
         return cls.query().filter(cls.ui_key == key)
 
     @classmethod
     def get_builtin_hooks(cls):
         q = cls.query()
         q = q.filter(cls.ui_key.in_([cls.HOOK_UPDATE,
                                      cls.HOOK_REPO_SIZE,
                                      cls.HOOK_PUSH, cls.HOOK_PULL]))
         return q.all()
 
     @classmethod
     def get_custom_hooks(cls):
         q = cls.query()
         q = q.filter(~cls.ui_key.in_([cls.HOOK_UPDATE,
                                       cls.HOOK_REPO_SIZE,
                                       cls.HOOK_PUSH, cls.HOOK_PULL]))
         q = q.filter(cls.ui_section == 'hooks')
         return q.all()
 
     @classmethod
     def create_or_update_hook(cls, key, val):
         new_ui = cls.get_by_key(key).scalar() or cls()
         new_ui.ui_section = 'hooks'
         new_ui.ui_active = True
         new_ui.ui_key = key
         new_ui.ui_value = val
 
         Session.add(new_ui)
 
 
 class User(Base, BaseModel):
     __tablename__ = 'users'
     __table_args__ = (
         UniqueConstraint('username'), UniqueConstraint('email'),
         {'extend_existing': True}
     )
     user_id = Column("user_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
     username = Column("username", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     password = Column("password", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     active = Column("active", Boolean(), nullable=True, unique=None, default=None)
     admin = Column("admin", Boolean(), nullable=True, unique=None, default=False)
     name = Column("name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     lastname = Column("lastname", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     _email = Column("email", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     last_login = Column("last_login", DateTime(timezone=False), nullable=True, unique=None, default=None)
     ldap_dn = Column("ldap_dn", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
     api_key = Column("api_key", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
 
     user_log = relationship('UserLog', cascade='all')
     user_perms = relationship('UserToPerm', primaryjoin="User.user_id==UserToPerm.user_id", cascade='all')
 
     repositories = relationship('Repository')
     user_followers = relationship('UserFollowing', primaryjoin='UserFollowing.follows_user_id==User.user_id', cascade='all')
     repo_to_perm = relationship('UserRepoToPerm', primaryjoin='UserRepoToPerm.user_id==User.user_id', cascade='all')
 
     group_member = relationship('UsersGroupMember', cascade='all')
 
     notifications = relationship('UserNotification',)
 
     @hybrid_property
     def email(self):
         return self._email
 
     @email.setter
     def email(self, val):
         self._email = val.lower() if val else None
 
     @property
     def full_name(self):
         return '%s %s' % (self.name, self.lastname)
 
     @property
     def full_name_or_username(self):
         return ('%s %s' % (self.name, self.lastname)
                 if (self.name and self.lastname) else self.username)
 
     @property
     def full_contact(self):
         return '%s %s <%s>' % (self.name, self.lastname, self.email)
 
     @property
     def short_contact(self):
         return '%s %s' % (self.name, self.lastname)
 
     @property
     def is_admin(self):
         return self.admin
 
     def __repr__(self):
         return "<%s('id:%s:%s')>" % (self.__class__.__name__,
                                      self.user_id, self.username)
 
     @classmethod
     def get_by_username(cls, username, case_insensitive=False, cache=False):
         if case_insensitive:
             q = cls.query().filter(cls.username.ilike(username))
         else:
             q = cls.query().filter(cls.username == username)
 
         if cache:
             q = q.options(FromCache(
|
343 | q = q.options(FromCache( | |
344 | "sql_cache_short", |
|
344 | "sql_cache_short", | |
345 | "get_user_%s" % _hash_key(username) |
|
345 | "get_user_%s" % _hash_key(username) | |
346 | ) |
|
346 | ) | |
347 | ) |
|
347 | ) | |
348 | return q.scalar() |
|
348 | return q.scalar() | |
349 |
|
349 | |||
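The cached lookup above keys the Beaker region on a hash of the username. A minimal stdlib sketch of that pattern (the `_hash_key` helper's exact shape is an assumption, and a plain dict stands in for the `"sql_cache_short"` Beaker region):

```python
import hashlib

# Stand-in for the "sql_cache_short" Beaker cache region.
_cache = {}

def _hash_key(k):
    # Stable cache key for arbitrary text input, mirroring the
    # _hash_key helper that get_by_username relies on (assumed shape).
    return hashlib.md5(k.encode('utf-8')).hexdigest()

def get_by_username_cached(username, lookup):
    """Return lookup(username), memoized under a hashed key."""
    key = 'get_user_%s' % _hash_key(username)
    if key not in _cache:
        _cache[key] = lookup(username)
    return _cache[key]
```

Hashing the key keeps it a fixed length and avoids putting raw user input into the cache namespace.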
    @classmethod
    def get_by_api_key(cls, api_key, cache=False):
        q = cls.query().filter(cls.api_key == api_key)

        if cache:
            q = q.options(FromCache("sql_cache_short",
                                    "get_api_key_%s" % api_key))
        return q.scalar()

    @classmethod
    def get_by_email(cls, email, case_insensitive=False, cache=False):
        if case_insensitive:
            q = cls.query().filter(cls.email.ilike(email))
        else:
            q = cls.query().filter(cls.email == email)

        if cache:
            # use an email-specific key; reusing the "get_api_key_" prefix
            # would let api-key and email lookups collide in the shared region
            q = q.options(FromCache("sql_cache_short",
                                    "get_email_%s" % email))
        return q.scalar()

    def update_lastlogin(self):
        """Update user lastlogin"""
        self.last_login = datetime.datetime.now()
        Session.add(self)
        log.debug('updated user %s lastlogin' % self.username)

    def __json__(self):
        return dict(
            email=self.email,
            full_name=self.full_name,
            full_name_or_username=self.full_name_or_username,
            short_contact=self.short_contact,
            full_contact=self.full_contact
        )


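The `email` hybrid property above normalizes addresses to lowercase on write, so lookups never miss on case. The same pattern as a plain Python property, without SQLAlchemy (the `Account` class name is illustrative):

```python
class Account(object):
    def __init__(self, email=None):
        self._email = None
        self.email = email  # route through the setter so it normalizes

    @property
    def email(self):
        return self._email

    @email.setter
    def email(self, val):
        # same rule as User.email: lowercase when set, None stays None
        self._email = val.lower() if val else None
```

With `hybrid_property` the setter additionally applies at the SQL-expression level; the Python-side behavior is identical to this sketch.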
class UserLog(Base, BaseModel):
    __tablename__ = 'user_logs'
    __table_args__ = {'extend_existing': True}
    user_log_id = Column("user_log_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
    repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True)
    repository_name = Column("repository_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
    user_ip = Column("user_ip", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
    action = Column("action", UnicodeText(length=1200000, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
    action_date = Column("action_date", DateTime(timezone=False), nullable=True, unique=None, default=None)

    @property
    def action_as_day(self):
        return datetime.date(*self.action_date.timetuple()[:3])

    user = relationship('User')
    repository = relationship('Repository', cascade='')


class UsersGroup(Base, BaseModel):
    __tablename__ = 'users_groups'
    __table_args__ = {'extend_existing': True}

    users_group_id = Column("users_group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    users_group_name = Column("users_group_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=False, unique=True, default=None)
    users_group_active = Column("users_group_active", Boolean(), nullable=True, unique=None, default=None)

    members = relationship('UsersGroupMember', cascade="all, delete, delete-orphan", lazy="joined")
    users_group_to_perm = relationship('UsersGroupToPerm', cascade='all')

    def __repr__(self):
        return '<userGroup(%s)>' % (self.users_group_name)

    @classmethod
    def get_by_group_name(cls, group_name, cache=False,
                          case_insensitive=False):
        if case_insensitive:
            q = cls.query().filter(cls.users_group_name.ilike(group_name))
        else:
            q = cls.query().filter(cls.users_group_name == group_name)
        if cache:
            # group-specific key; the "get_user_" prefix would collide with
            # User.get_by_username entries in the same cache region
            q = q.options(FromCache(
                "sql_cache_short",
                "get_users_group_%s" % _hash_key(group_name)
            ))
        return q.scalar()

    @classmethod
    def get(cls, users_group_id, cache=False):
        users_group = cls.query()
        if cache:
            users_group = users_group.options(FromCache(
                "sql_cache_short",
                "get_users_group_%s" % users_group_id))
        return users_group.get(users_group_id)


class UsersGroupMember(Base, BaseModel):
    __tablename__ = 'users_groups_members'
    __table_args__ = {'extend_existing': True}

    users_group_member_id = Column("users_group_member_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)

    user = relationship('User', lazy='joined')
    users_group = relationship('UsersGroup')

    def __init__(self, gr_id='', u_id=''):
        self.users_group_id = gr_id
        self.user_id = u_id


class Repository(Base, BaseModel):
    __tablename__ = 'repositories'
    __table_args__ = (
        UniqueConstraint('repo_name'),
        {'extend_existing': True},
    )

    repo_id = Column("repo_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    repo_name = Column("repo_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=False, unique=True, default=None)
    clone_uri = Column("clone_uri", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=False, default=None)
    repo_type = Column("repo_type", String(length=255, convert_unicode=False, assert_unicode=None), nullable=False, unique=False, default='hg')
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None)
    private = Column("private", Boolean(), nullable=True, unique=None, default=None)
    enable_statistics = Column("statistics", Boolean(), nullable=True, unique=None, default=True)
    enable_downloads = Column("downloads", Boolean(), nullable=True, unique=None, default=True)
    description = Column("description", String(length=10000, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
    created_on = Column('created_on', DateTime(timezone=False), nullable=True, unique=None, default=datetime.datetime.now)

    fork_id = Column("fork_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True, unique=False, default=None)
    group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=True, unique=False, default=None)

    user = relationship('User')
    fork = relationship('Repository', remote_side=repo_id)
    group = relationship('RepoGroup')
    repo_to_perm = relationship('UserRepoToPerm', cascade='all', order_by='UserRepoToPerm.repo_to_perm_id')
    users_group_to_perm = relationship('UsersGroupRepoToPerm', cascade='all')
    stats = relationship('Statistics', cascade='all', uselist=False)

    followers = relationship('UserFollowing', primaryjoin='UserFollowing.follows_repo_id==Repository.repo_id', cascade='all')

    logs = relationship('UserLog')

    def __repr__(self):
        return "<%s('%s:%s')>" % (self.__class__.__name__,
                                  self.repo_id, self.repo_name)

    @classmethod
    def url_sep(cls):
        return '/'

    @classmethod
    def get_by_repo_name(cls, repo_name):
        q = Session.query(cls).filter(cls.repo_name == repo_name)
        q = q.options(joinedload(Repository.fork))\
            .options(joinedload(Repository.user))\
            .options(joinedload(Repository.group))
        return q.scalar()

    @classmethod
    def get_repo_forks(cls, repo_id):
        return cls.query().filter(Repository.fork_id == repo_id)

    @classmethod
    def base_path(cls):
        """
        Returns the base path where all repositories are stored
        """
        q = Session.query(RhodeCodeUi)\
            .filter(RhodeCodeUi.ui_key == cls.url_sep())
        q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
        return q.one().ui_value

    @property
    def just_name(self):
        return self.repo_name.split(Repository.url_sep())[-1]

    @property
    def groups_with_parents(self):
        groups = []
        if self.group is None:
            return groups

        cur_gr = self.group
        groups.insert(0, cur_gr)
        while True:
            cur_gr = cur_gr.parent_group
            if cur_gr is None:
                break
            groups.insert(0, cur_gr)

        return groups

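`groups_with_parents` walks the `parent_group` chain up to the root, prepending each ancestor so the resulting list runs root-first. The same walk in isolation, with toy nodes standing in for `RepoGroup` rows:

```python
class Node(object):
    # minimal stand-in for RepoGroup: a name plus an optional parent
    def __init__(self, name, parent=None):
        self.name = name
        self.parent_group = parent

def groups_with_parents(group):
    """Return [root, ..., group], mirroring the property above."""
    groups = []
    if group is None:
        return groups
    cur_gr = group
    groups.insert(0, cur_gr)
    while True:
        cur_gr = cur_gr.parent_group
        if cur_gr is None:
            break
        groups.insert(0, cur_gr)
    return groups
```

Root-first ordering is what lets callers render breadcrumb paths directly from the list.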
    @property
    def groups_and_repo(self):
        return self.groups_with_parents, self.just_name

    @LazyProperty
    def repo_path(self):
        """
        Returns the base full path for this repository, i.e. where it
        actually lives on the filesystem
        """
        q = Session.query(RhodeCodeUi).filter(RhodeCodeUi.ui_key ==
                                              Repository.url_sep())
        q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
        return q.one().ui_value

    @property
    def repo_full_path(self):
        p = [self.repo_path]
        # we need to split the name by / since this is how we store the
        # names in the database, but that eventually needs to be converted
        # into a valid system path
        p += self.repo_name.split(Repository.url_sep())
        return os.path.join(*p)

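`repo_full_path` rebuilds a filesystem path from the `/`-separated name stored in the database. That conversion as a standalone helper (the base path is illustrative):

```python
import os

def repo_full_path(base_path, repo_name, url_sep='/'):
    # names are stored with '/' separators in the database; split on the
    # URL separator and let os.path.join apply the platform separator
    parts = [base_path] + repo_name.split(url_sep)
    return os.path.join(*parts)
```

Splitting before joining is what keeps nested-group names portable across platforms with different path separators.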
    def get_new_name(self, repo_name):
        """
        returns the new full repository name based on the assigned group
        and the given new name

        :param repo_name: new name of the repository
        """
        path_prefix = self.group.full_path_splitted if self.group else []
        return Repository.url_sep().join(path_prefix + [repo_name])

    @property
    def _ui(self):
        """
        Creates a db-based ui object for this repository
        """
        from mercurial import ui
        from mercurial import config
        baseui = ui.ui()

        # clean the baseui object
        baseui._ocfg = config.config()
        baseui._ucfg = config.config()
        baseui._tcfg = config.config()

        ret = RhodeCodeUi.query()\
            .options(FromCache("sql_cache_short", "repository_repo_ui")).all()

        hg_ui = ret
        for ui_ in hg_ui:
            if ui_.ui_active:
                log.debug('settings ui from db[%s]%s:%s', ui_.ui_section,
                          ui_.ui_key, ui_.ui_value)
                baseui.setconfig(ui_.ui_section, ui_.ui_key, ui_.ui_value)

        return baseui

    @classmethod
    def is_valid(cls, repo_name):
        """
        returns True if the given repo name points to a valid filesystem
        repository

        :param repo_name:
        """
        from rhodecode.lib.utils import is_valid_repo

        return is_valid_repo(repo_name, cls.base_path())

    #==========================================================================
    # SCM PROPERTIES
    #==========================================================================

    def get_changeset(self, rev):
        return get_changeset_safe(self.scm_instance, rev)

    @property
    def tip(self):
        return self.get_changeset('tip')

    @property
    def author(self):
        return self.tip.author

    @property
    def last_change(self):
        return self.scm_instance.last_change

    def comments(self, revisions=None):
        """
        Returns comments for this repository grouped by revisions

        :param revisions: filter query by revisions only
        """
        cmts = ChangesetComment.query()\
            .filter(ChangesetComment.repo == self)
        if revisions:
            cmts = cmts.filter(ChangesetComment.revision.in_(revisions))
        grouped = defaultdict(list)
        for cmt in cmts.all():
            grouped[cmt.revision].append(cmt)
        return grouped

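`comments` groups the query results by revision with a `defaultdict(list)`, so callers can look up all comments for a changeset in one step. The grouping step in isolation, over plain tuples:

```python
from collections import defaultdict

def group_by_revision(comments):
    """comments: iterable of (revision, text) pairs -> {revision: [texts]}"""
    grouped = defaultdict(list)
    for revision, text in comments:
        # defaultdict creates the list on first access, so no
        # "if revision not in grouped" guard is needed
        grouped[revision].append(text)
    return grouped
```

The defaultdict also means revisions with no comments simply never appear as keys.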
    #==========================================================================
    # SCM CACHE INSTANCE
    #==========================================================================

    @property
    def invalidate(self):
        return CacheInvalidation.invalidate(self.repo_name)

    def set_invalidate(self):
        """
        Sets a cache invalidation marker for this instance
        """
        CacheInvalidation.set_invalidate(self.repo_name)

    @LazyProperty
    def scm_instance(self):
        return self.__get_instance()

    @property
    def scm_instance_cached(self):
        @cache_region('long_term')
        def _c(repo_name):
            return self.__get_instance()
        rn = self.repo_name
        log.debug('Getting cached instance of repo')
        inv = self.invalidate
        if inv is not None:
            region_invalidate(_c, None, rn)
            # update our cache
            CacheInvalidation.set_valid(inv.cache_key)
        return _c(rn)

    def __get_instance(self):
        repo_full_path = self.repo_full_path
        try:
            alias = get_scm(repo_full_path)[0]
            log.debug('Creating instance of %s repository' % alias)
            backend = get_backend(alias)
        except VCSError:
            log.error(traceback.format_exc())
            log.error('Perhaps this repository is in the database but not on '
                      'the filesystem; run "rescan repositories" with the '
                      '"destroy old data" option from the admin panel')
            return

        if alias == 'hg':
            repo = backend(safe_str(repo_full_path), create=False,
                           baseui=self._ui)
            # skip hidden web repository
            if repo._get_hidden():
                return
        else:
            repo = backend(repo_full_path, create=False)

        return repo


class RepoGroup(Base, BaseModel):
    __tablename__ = 'groups'
    __table_args__ = (
        UniqueConstraint('group_name', 'group_parent_id'),
        CheckConstraint('group_id != group_parent_id'),
        {'extend_existing': True},
    )
    __mapper_args__ = {'order_by': 'group_name'}

    group_id = Column("group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    group_name = Column("group_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=False, unique=True, default=None)
    group_parent_id = Column("group_parent_id", Integer(), ForeignKey('groups.group_id'), nullable=True, unique=None, default=None)
    group_description = Column("group_description", String(length=10000, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)

    repo_group_to_perm = relationship('UserRepoGroupToPerm', cascade='all', order_by='UserRepoGroupToPerm.group_to_perm_id')
    users_group_to_perm = relationship('UsersGroupRepoGroupToPerm', cascade='all')

    parent_group = relationship('RepoGroup', remote_side=group_id)

    def __init__(self, group_name='', parent_group=None):
        self.group_name = group_name
        self.parent_group = parent_group

    def __repr__(self):
        return "<%s('%s:%s')>" % (self.__class__.__name__, self.group_id,
                                  self.group_name)

    @classmethod
    def groups_choices(cls):
        from webhelpers.html import literal as _literal
        repo_groups = [('', '')]
        sep = ' &raquo; '
        _name = lambda k: _literal(sep.join(k))

        repo_groups.extend([(x.group_id, _name(x.full_path_splitted))
                            for x in cls.query().all()])

        repo_groups = sorted(repo_groups, key=lambda t: t[1].split(sep)[0])
        return repo_groups

    @classmethod
    def url_sep(cls):
        return '/'

    @classmethod
    def get_by_group_name(cls, group_name, cache=False, case_insensitive=False):
        if case_insensitive:
            gr = cls.query()\
                .filter(cls.group_name.ilike(group_name))
        else:
            gr = cls.query()\
                .filter(cls.group_name == group_name)
        if cache:
            gr = gr.options(FromCache(
                    "sql_cache_short",
                    "get_group_%s" % _hash_key(group_name)
                )
            )
        return gr.scalar()

    @property
    def parents(self):
        parents_recursion_limit = 5
        groups = []
        if self.parent_group is None:
            return groups
        cur_gr = self.parent_group
        groups.insert(0, cur_gr)
        cnt = 0
        while 1:
            cnt += 1
            gr = getattr(cur_gr, 'parent_group', None)
            cur_gr = cur_gr.parent_group
            if gr is None:
                break
            if cnt == parents_recursion_limit:
                # this will prevent accidental infinite loops
                log.error('group nested more than %s' %
                          parents_recursion_limit)
                break

            groups.insert(0, gr)
        return groups

    @property
    def children(self):
        return RepoGroup.query().filter(RepoGroup.parent_group == self)

    @property
    def name(self):
        return self.group_name.split(RepoGroup.url_sep())[-1]

    @property
    def full_path(self):
        return self.group_name

    @property
    def full_path_splitted(self):
        return self.group_name.split(RepoGroup.url_sep())

    @property
    def repositories(self):
        return Repository.query()\
            .filter(Repository.group == self)\
            .order_by(Repository.repo_name)

    @property
    def repositories_recursive_count(self):
        cnt = self.repositories.count()

        def children_count(group):
            cnt = 0
            for child in group.children:
                cnt += child.repositories.count()
                cnt += children_count(child)
            return cnt

        return cnt + children_count(self)

    def get_new_name(self, group_name):
        """
        returns new full group name based on parent and new name

        :param group_name:
        """
        path_prefix = (self.parent_group.full_path_splitted if
                       self.parent_group else [])
        return RepoGroup.url_sep().join(path_prefix + [group_name])


class Permission(Base, BaseModel):
    __tablename__ = 'permissions'
    __table_args__ = {'extend_existing': True}
    permission_id = Column("permission_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    permission_name = Column("permission_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
    permission_longname = Column("permission_longname", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)

    def __repr__(self):
        return "<%s('%s:%s')>" % (
            self.__class__.__name__, self.permission_id, self.permission_name
        )

    @classmethod
    def get_by_key(cls, key):
        return cls.query().filter(cls.permission_name == key).scalar()

    @classmethod
    def get_default_perms(cls, default_user_id):
        q = Session.query(UserRepoToPerm, Repository, cls)\
            .join((Repository, UserRepoToPerm.repository_id == Repository.repo_id))\
            .join((cls, UserRepoToPerm.permission_id == cls.permission_id))\
            .filter(UserRepoToPerm.user_id == default_user_id)

        return q.all()

    @classmethod
    def get_default_group_perms(cls, default_user_id):
        q = Session.query(UserRepoGroupToPerm, RepoGroup, cls)\
            .join((RepoGroup, UserRepoGroupToPerm.group_id == RepoGroup.group_id))\
            .join((cls, UserRepoGroupToPerm.permission_id == cls.permission_id))\
            .filter(UserRepoGroupToPerm.user_id == default_user_id)

        return q.all()


class UserRepoToPerm(Base, BaseModel):
    __tablename__ = 'repo_to_perm'
    __table_args__ = (
        UniqueConstraint('user_id', 'repository_id', 'permission_id'),
        {'extend_existing': True}
    )
    repo_to_perm_id = Column("repo_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
    permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
    repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)

    user = relationship('User')
    repository = relationship('Repository')
    permission = relationship('Permission')

    @classmethod
    def create(cls, user, repository, permission):
        n = cls()
        n.user = user
        n.repository = repository
        n.permission = permission
        Session.add(n)
        return n

    def __repr__(self):
        return '<user:%s => %s >' % (self.user, self.repository)


class UserToPerm(Base, BaseModel):
    __tablename__ = 'user_to_perm'
    __table_args__ = (
        UniqueConstraint('user_id', 'permission_id'),
        {'extend_existing': True}
    )
    user_to_perm_id = Column("user_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
    permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)

    user = relationship('User')
    permission = relationship('Permission', lazy='joined')


class UsersGroupRepoToPerm(Base, BaseModel):
    __tablename__ = 'users_group_repo_to_perm'
    __table_args__ = (
        UniqueConstraint('repository_id', 'users_group_id', 'permission_id'),
        {'extend_existing': True}
    )
    users_group_to_perm_id = Column("users_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
    permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
    repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)

    users_group = relationship('UsersGroup')
    permission = relationship('Permission')
    repository = relationship('Repository')

    @classmethod
    def create(cls, users_group, repository, permission):
        n = cls()
        n.users_group = users_group
        n.repository = repository
        n.permission = permission
        Session.add(n)
        return n

    def __repr__(self):
        return '<userGroup:%s => %s >' % (self.users_group, self.repository)


class UsersGroupToPerm(Base, BaseModel):
    __tablename__ = 'users_group_to_perm'
    __table_args__ = (
        UniqueConstraint('users_group_id', 'permission_id',),
        {'extend_existing': True}
    )
    users_group_to_perm_id = Column("users_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
    permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)

    users_group = relationship('UsersGroup')
    permission = relationship('Permission')


class UserRepoGroupToPerm(Base, BaseModel):
    __tablename__ = 'user_repo_group_to_perm'
    __table_args__ = (
        UniqueConstraint('user_id', 'group_id', 'permission_id'),
        {'extend_existing': True}
    )

    group_to_perm_id = Column("group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
    group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=False, unique=None, default=None)
    permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)

    user = relationship('User')
    group = relationship('RepoGroup')
    permission = relationship('Permission')


class UsersGroupRepoGroupToPerm(Base, BaseModel):
    __tablename__ = 'users_group_repo_group_to_perm'
    __table_args__ = (
        UniqueConstraint('users_group_id', 'group_id'),
        {'extend_existing': True}
    )

    users_group_repo_group_to_perm_id = Column("users_group_repo_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
    group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=False, unique=None, default=None)
    permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)

    users_group = relationship('UsersGroup')
    permission = relationship('Permission')
    group = relationship('RepoGroup')


class Statistics(Base, BaseModel):
    __tablename__ = 'statistics'
    __table_args__ = (UniqueConstraint('repository_id'), {'extend_existing': True})
    stat_id = Column("stat_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=True, default=None)
    stat_on_revision = Column("stat_on_revision", Integer(), nullable=False)
    commit_activity = Column("commit_activity", LargeBinary(1000000), nullable=False)  # JSON data
    commit_activity_combined = Column("commit_activity_combined", LargeBinary(), nullable=False)  # JSON data
    languages = Column("languages", LargeBinary(1000000), nullable=False)  # JSON data

    repository = relationship('Repository', single_parent=True)


class UserFollowing(Base, BaseModel):
    __tablename__ = 'user_followings'
    __table_args__ = (
        UniqueConstraint('user_id', 'follows_repository_id'),
        UniqueConstraint('user_id', 'follows_user_id'),
        {'extend_existing': True}
    )

    user_following_id = Column("user_following_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
    follows_repo_id = Column("follows_repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True, unique=None, default=None)
    follows_user_id = Column("follows_user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
    follows_from = Column('follows_from', DateTime(timezone=False), nullable=True, unique=None, default=datetime.datetime.now)

    user = relationship('User', primaryjoin='User.user_id==UserFollowing.user_id')

    follows_user = relationship('User', primaryjoin='User.user_id==UserFollowing.follows_user_id')
    follows_repository = relationship('Repository', order_by='Repository.repo_name')

    @classmethod
    def get_repo_followers(cls, repo_id):
        return cls.query().filter(cls.follows_repo_id == repo_id)


class CacheInvalidation(Base, BaseModel):
    __tablename__ = 'cache_invalidation'
    __table_args__ = (UniqueConstraint('cache_key'), {'extend_existing': True})
    cache_id = Column("cache_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    cache_key = Column("cache_key", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
    cache_args = Column("cache_args", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
    cache_active = Column("cache_active", Boolean(), nullable=True, unique=None, default=False)

    def __init__(self, cache_key, cache_args=''):
        self.cache_key = cache_key
        self.cache_args = cache_args
        self.cache_active = False

    def __repr__(self):
        return "<%s('%s:%s')>" % (self.__class__.__name__,
                                  self.cache_id, self.cache_key)

    @classmethod
    def _get_key(cls, key):
        """
        Wrapper for generating a key

        :param key:
        """
        import rhodecode
        prefix = ''
        iid = rhodecode.CONFIG.get('instance_id')
        if iid:
            prefix = iid
        return "%s%s" % (prefix, key)

    @classmethod
    def get_by_key(cls, key):
        return cls.query().filter(cls.cache_key == key).scalar()

    @classmethod
    def invalidate(cls, key):
        """
        Returns Invalidation object if this given key should be invalidated
        None otherwise. `cache_active = False` means that this cache
        state is not valid and needs to be invalidated

        :param key:
        """
        return cls.query()\
            .filter(CacheInvalidation.cache_key == key)\
            .filter(CacheInvalidation.cache_active == False)\
            .scalar()

    @classmethod
    def set_invalidate(cls, key):
        """
        Mark this Cache key for invalidation

        :param key:
        """

        log.debug('marking %s for invalidation' % key)
1084 | inv_obj = Session.query(cls)\ |
|
1086 | inv_obj = Session.query(cls)\ | |
1085 | .filter(cls.cache_key == key).scalar() |
|
1087 | .filter(cls.cache_key == key).scalar() | |
1086 | if inv_obj: |
|
1088 | if inv_obj: | |
1087 | inv_obj.cache_active = False |
|
1089 | inv_obj.cache_active = False | |
1088 | else: |
|
1090 | else: | |
1089 | log.debug('cache key not found in invalidation db -> creating one') |
|
1091 | log.debug('cache key not found in invalidation db -> creating one') | |
1090 | inv_obj = CacheInvalidation(key) |
|
1092 | inv_obj = CacheInvalidation(key) | |
1091 |
|
1093 | |||
1092 | try: |
|
1094 | try: | |
1093 | Session.add(inv_obj) |
|
1095 | Session.add(inv_obj) | |
1094 | Session.commit() |
|
1096 | Session.commit() | |
1095 | except Exception: |
|
1097 | except Exception: | |
1096 | log.error(traceback.format_exc()) |
|
1098 | log.error(traceback.format_exc()) | |
1097 | Session.rollback() |
|
1099 | Session.rollback() | |
1098 |
|
1100 | |||
1099 | @classmethod |
|
1101 | @classmethod | |
1100 | def set_valid(cls, key): |
|
1102 | def set_valid(cls, key): | |
1101 | """ |
|
1103 | """ | |
1102 | Mark this cache key as active and currently cached |
|
1104 | Mark this cache key as active and currently cached | |
1103 |
|
1105 | |||
1104 | :param key: |
|
1106 | :param key: | |
1105 | """ |
|
1107 | """ | |
1106 | inv_obj = cls.get_by_key(key) |
|
1108 | inv_obj = cls.get_by_key(key) | |
1107 | inv_obj.cache_active = True |
|
1109 | inv_obj.cache_active = True | |
1108 | Session.add(inv_obj) |
|
1110 | Session.add(inv_obj) | |
1109 | Session.commit() |
|
1111 | Session.commit() | |
1110 |
|
1112 | |||
1111 |
|
1113 | |||
1112 | class ChangesetComment(Base, BaseModel): |
|
1114 | class ChangesetComment(Base, BaseModel): | |
1113 | __tablename__ = 'changeset_comments' |
|
1115 | __tablename__ = 'changeset_comments' | |
1114 | __table_args__ = ({'extend_existing': True},) |
|
1116 | __table_args__ = ({'extend_existing': True},) | |
1115 | comment_id = Column('comment_id', Integer(), nullable=False, primary_key=True) |
|
1117 | comment_id = Column('comment_id', Integer(), nullable=False, primary_key=True) | |
1116 | repo_id = Column('repo_id', Integer(), ForeignKey('repositories.repo_id'), nullable=False) |
|
1118 | repo_id = Column('repo_id', Integer(), ForeignKey('repositories.repo_id'), nullable=False) | |
1117 | revision = Column('revision', String(40), nullable=False) |
|
1119 | revision = Column('revision', String(40), nullable=False) | |
1118 | line_no = Column('line_no', Unicode(10), nullable=True) |
|
1120 | line_no = Column('line_no', Unicode(10), nullable=True) | |
1119 | f_path = Column('f_path', Unicode(1000), nullable=True) |
|
1121 | f_path = Column('f_path', Unicode(1000), nullable=True) | |
1120 | user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), nullable=False) |
|
1122 | user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), nullable=False) | |
1121 | text = Column('text', Unicode(25000), nullable=False) |
|
1123 | text = Column('text', Unicode(25000), nullable=False) | |
1122 | modified_at = Column('modified_at', DateTime(), nullable=False, default=datetime.datetime.now) |
|
1124 | modified_at = Column('modified_at', DateTime(), nullable=False, default=datetime.datetime.now) | |
1123 |
|
1125 | |||
1124 | author = relationship('User', lazy='joined') |
|
1126 | author = relationship('User', lazy='joined') | |
1125 | repo = relationship('Repository') |
|
1127 | repo = relationship('Repository') | |
1126 |
|
1128 | |||
1127 | @classmethod |
|
1129 | @classmethod | |
1128 | def get_users(cls, revision): |
|
1130 | def get_users(cls, revision): | |
1129 | """ |
|
1131 | """ | |
1130 | Returns user associated with this changesetComment. ie those |
|
1132 | Returns user associated with this changesetComment. ie those | |
1131 | who actually commented |
|
1133 | who actually commented | |
1132 |
|
1134 | |||
1133 | :param cls: |
|
1135 | :param cls: | |
1134 | :param revision: |
|
1136 | :param revision: | |
1135 | """ |
|
1137 | """ | |
1136 | return Session.query(User)\ |
|
1138 | return Session.query(User)\ | |
1137 | .filter(cls.revision == revision)\ |
|
1139 | .filter(cls.revision == revision)\ | |
1138 | .join(ChangesetComment.author).all() |
|
1140 | .join(ChangesetComment.author).all() | |
1139 |
|
1141 | |||
1140 |
|
1142 | |||
1141 | class Notification(Base, BaseModel): |
|
1143 | class Notification(Base, BaseModel): | |
1142 | __tablename__ = 'notifications' |
|
1144 | __tablename__ = 'notifications' | |
1143 | __table_args__ = ({'extend_existing': True},) |
|
1145 | __table_args__ = ({'extend_existing': True},) | |
1144 |
|
1146 | |||
1145 | TYPE_CHANGESET_COMMENT = u'cs_comment' |
|
1147 | TYPE_CHANGESET_COMMENT = u'cs_comment' | |
1146 | TYPE_MESSAGE = u'message' |
|
1148 | TYPE_MESSAGE = u'message' | |
1147 | TYPE_MENTION = u'mention' |
|
1149 | TYPE_MENTION = u'mention' | |
1148 | TYPE_REGISTRATION = u'registration' |
|
1150 | TYPE_REGISTRATION = u'registration' | |
1149 |
|
1151 | |||
1150 | notification_id = Column('notification_id', Integer(), nullable=False, primary_key=True) |
|
1152 | notification_id = Column('notification_id', Integer(), nullable=False, primary_key=True) | |
1151 | subject = Column('subject', Unicode(512), nullable=True) |
|
1153 | subject = Column('subject', Unicode(512), nullable=True) | |
1152 | body = Column('body', Unicode(50000), nullable=True) |
|
1154 | body = Column('body', Unicode(50000), nullable=True) | |
1153 | created_by = Column("created_by", Integer(), ForeignKey('users.user_id'), nullable=True) |
|
1155 | created_by = Column("created_by", Integer(), ForeignKey('users.user_id'), nullable=True) | |
1154 | created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now) |
|
1156 | created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now) | |
1155 | type_ = Column('type', Unicode(256)) |
|
1157 | type_ = Column('type', Unicode(256)) | |
1156 |
|
1158 | |||
1157 | created_by_user = relationship('User') |
|
1159 | created_by_user = relationship('User') | |
1158 | notifications_to_users = relationship('UserNotification', lazy='joined', |
|
1160 | notifications_to_users = relationship('UserNotification', lazy='joined', | |
1159 | cascade="all, delete, delete-orphan") |
|
1161 | cascade="all, delete, delete-orphan") | |
1160 |
|
1162 | |||
1161 | @property |
|
1163 | @property | |
1162 | def recipients(self): |
|
1164 | def recipients(self): | |
1163 | return [x.user for x in UserNotification.query()\ |
|
1165 | return [x.user for x in UserNotification.query()\ | |
1164 | .filter(UserNotification.notification == self).all()] |
|
1166 | .filter(UserNotification.notification == self).all()] | |
1165 |
|
1167 | |||
1166 | @classmethod |
|
1168 | @classmethod | |
1167 | def create(cls, created_by, subject, body, recipients, type_=None): |
|
1169 | def create(cls, created_by, subject, body, recipients, type_=None): | |
1168 | if type_ is None: |
|
1170 | if type_ is None: | |
1169 | type_ = Notification.TYPE_MESSAGE |
|
1171 | type_ = Notification.TYPE_MESSAGE | |
1170 |
|
1172 | |||
1171 | notification = cls() |
|
1173 | notification = cls() | |
1172 | notification.created_by_user = created_by |
|
1174 | notification.created_by_user = created_by | |
1173 | notification.subject = subject |
|
1175 | notification.subject = subject | |
1174 | notification.body = body |
|
1176 | notification.body = body | |
1175 | notification.type_ = type_ |
|
1177 | notification.type_ = type_ | |
1176 | notification.created_on = datetime.datetime.now() |
|
1178 | notification.created_on = datetime.datetime.now() | |
1177 |
|
1179 | |||
1178 | for u in recipients: |
|
1180 | for u in recipients: | |
1179 | assoc = UserNotification() |
|
1181 | assoc = UserNotification() | |
1180 | assoc.notification = notification |
|
1182 | assoc.notification = notification | |
1181 | u.notifications.append(assoc) |
|
1183 | u.notifications.append(assoc) | |
1182 | Session.add(notification) |
|
1184 | Session.add(notification) | |
1183 | return notification |
|
1185 | return notification | |
1184 |
|
1186 | |||
1185 | @property |
|
1187 | @property | |
1186 | def description(self): |
|
1188 | def description(self): | |
1187 | from rhodecode.model.notification import NotificationModel |
|
1189 | from rhodecode.model.notification import NotificationModel | |
1188 | return NotificationModel().make_description(self) |
|
1190 | return NotificationModel().make_description(self) | |
1189 |
|
1191 | |||
1190 |
|
1192 | |||
1191 | class UserNotification(Base, BaseModel): |
|
1193 | class UserNotification(Base, BaseModel): | |
1192 | __tablename__ = 'user_to_notification' |
|
1194 | __tablename__ = 'user_to_notification' | |
1193 | __table_args__ = ( |
|
1195 | __table_args__ = ( | |
1194 | UniqueConstraint('user_id', 'notification_id'), |
|
1196 | UniqueConstraint('user_id', 'notification_id'), | |
1195 | {'extend_existing': True} |
|
1197 | {'extend_existing': True} | |
1196 | ) |
|
1198 | ) | |
1197 | user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), primary_key=True) |
|
1199 | user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), primary_key=True) | |
1198 | notification_id = Column("notification_id", Integer(), ForeignKey('notifications.notification_id'), primary_key=True) |
|
1200 | notification_id = Column("notification_id", Integer(), ForeignKey('notifications.notification_id'), primary_key=True) | |
1199 | read = Column('read', Boolean, default=False) |
|
1201 | read = Column('read', Boolean, default=False) | |
1200 | sent_on = Column('sent_on', DateTime(timezone=False), nullable=True, unique=None) |
|
1202 | sent_on = Column('sent_on', DateTime(timezone=False), nullable=True, unique=None) | |
1201 |
|
1203 | |||
1202 | user = relationship('User', lazy="joined") |
|
1204 | user = relationship('User', lazy="joined") | |
1203 | notification = relationship('Notification', lazy="joined", |
|
1205 | notification = relationship('Notification', lazy="joined", | |
1204 | order_by=lambda: Notification.created_on.desc(),) |
|
1206 | order_by=lambda: Notification.created_on.desc(),) | |
1205 |
|
1207 | |||
1206 | def mark_as_read(self): |
|
1208 | def mark_as_read(self): | |
1207 | self.read = True |
|
1209 | self.read = True | |
1208 | Session.add(self) |
|
1210 | Session.add(self) | |
1209 |
|
1211 | |||
1210 |
|
1212 | |||
1211 | class DbMigrateVersion(Base, BaseModel): |
|
1213 | class DbMigrateVersion(Base, BaseModel): | |
1212 | __tablename__ = 'db_migrate_version' |
|
1214 | __tablename__ = 'db_migrate_version' | |
1213 | __table_args__ = {'extend_existing': True} |
|
1215 | __table_args__ = {'extend_existing': True} | |
1214 | repository_id = Column('repository_id', String(250), primary_key=True) |
|
1216 | repository_id = Column('repository_id', String(250), primary_key=True) | |
1215 | repository_path = Column('repository_path', Text) |
|
1217 | repository_path = Column('repository_path', Text) | |
1216 | version = Column('version', Integer) |
|
1218 | version = Column('version', Integer) |
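The `CacheInvalidation` model above implements a simple validity-flag protocol: a key with `cache_active = False` must be refreshed, `set_invalidate` flips the flag off (creating the row on first use), and `set_valid` flips it back on once the cache has been rebuilt. A minimal in-memory sketch of that protocol, with a plain dict standing in for the SQLAlchemy session (the class and method names here are illustrative, not part of RhodeCode):

```python
class SimpleInvalidation(object):
    """In-memory stand-in for the CacheInvalidation validity-flag pattern."""

    def __init__(self):
        self._active = {}  # cache_key -> bool (True means the cache is valid)

    def invalidate(self, key):
        # mark the key as stale; creates the record on first use,
        # mirroring set_invalidate() above
        self._active[key] = False

    def set_valid(self, key):
        # called after the cached data for `key` has been rebuilt
        self._active[key] = True

    def needs_refresh(self, key):
        # truthy when the key is stale or unknown, like
        # CacheInvalidation.invalidate() returning a row
        return not self._active.get(key, False)


inv = SimpleInvalidation()
inv.invalidate('repo_1')
print(inv.needs_refresh('repo_1'))  # True: cache must be rebuilt
inv.set_valid('repo_1')
print(inv.needs_refresh('repo_1'))  # False: cached value can be trusted
```

The real model persists the flag in the `cache_invalidation` table so the decision survives process restarts and is shared between workers.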
@@ -1,216 +1,225 b'' | |||||
# -*- coding: utf-8 -*-
"""
    rhodecode.model.notification
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    Model for notifications


    :created_on: Nov 20, 2011
    :author: marcink
    :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com>
    :license: GPLv3, see COPYING for more details.
"""
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

import os
import logging
import traceback
import datetime

from pylons.i18n.translation import _

import rhodecode
from rhodecode.lib import helpers as h
from rhodecode.model import BaseModel
from rhodecode.model.db import Notification, User, UserNotification

log = logging.getLogger(__name__)


class NotificationModel(BaseModel):

    def __get_user(self, user):
        return self._get_instance(User, user, callback=User.get_by_username)

    def __get_notification(self, notification):
        if isinstance(notification, Notification):
            return notification
        elif isinstance(notification, int):
            return Notification.get(notification)
        else:
            if notification:
                raise Exception('notification must be int or Instance'
                                ' of Notification got %s' % type(notification))

    def create(self, created_by, subject, body, recipients=None,
               type_=Notification.TYPE_MESSAGE, with_email=True,
               email_kwargs={}):
        """

        Creates notification of given type

        :param created_by: int, str or User instance. User who created this
            notification
        :param subject:
        :param body:
        :param recipients: list of int, str or User objects, when None
            is given send to all admins
        :param type_: type of notification
        :param with_email: send email with this notification
        :param email_kwargs: additional dict to pass as args to email template
        """
        from rhodecode.lib.celerylib import tasks, run_task

        if recipients and not getattr(recipients, '__iter__', False):
            raise Exception('recipients must be a list of iterable')

        created_by_obj = self.__get_user(created_by)

        if recipients:
            recipients_objs = []
            for u in recipients:
                obj = self.__get_user(u)
                if obj:
                    recipients_objs.append(obj)
            recipients_objs = set(recipients_objs)
            log.debug('sending notifications %s to %s' % (
                type_, recipients_objs)
            )
        else:
            # empty recipients means to all admins
            recipients_objs = User.query().filter(User.admin == True).all()
            log.debug('sending notifications %s to admins: %s' % (
                type_, recipients_objs)
            )
        notif = Notification.create(
            created_by=created_by_obj, subject=subject,
            body=body, recipients=recipients_objs, type_=type_
        )

        if with_email is False:
            return notif

        # send email with notification
        for rec in recipients_objs:
            email_subject = NotificationModel().make_description(notif, False)
            type_ = type_
            email_body = body
            kwargs = {'subject': subject, 'body': h.rst_w_mentions(body)}
            kwargs.update(email_kwargs)
            email_body_html = EmailNotificationModel()\
                .get_email_tmpl(type_, **kwargs)
            run_task(tasks.send_email, rec.email, email_subject, email_body,
                     email_body_html)

        return notif

    def delete(self, user, notification):
        # we don't want to remove actual notification just the assignment
        try:
            notification = self.__get_notification(notification)
            user = self.__get_user(user)
            if notification and user:
                obj = UserNotification.query()\
                    .filter(UserNotification.user == user)\
                    .filter(UserNotification.notification
                            == notification)\
                    .one()
                self.sa.delete(obj)
                return True
        except Exception:
            log.error(traceback.format_exc())
            raise

    def get_for_user(self, user):
        user = self.__get_user(user)
        return user.notifications

    def mark_all_read_for_user(self, user):
        user = self.__get_user(user)
        UserNotification.query()\
            .filter(UserNotification.read == False)\
            .update({'read': True})

    def get_unread_cnt_for_user(self, user):
        user = self.__get_user(user)
        return UserNotification.query()\
            .filter(UserNotification.read == False)\
            .filter(UserNotification.user == user).count()

    def get_unread_for_user(self, user):
        user = self.__get_user(user)
        return [x.notification for x in UserNotification.query()\
                .filter(UserNotification.read == False)\
                .filter(UserNotification.user == user).all()]

    def get_user_notification(self, user, notification):
        user = self.__get_user(user)
        notification = self.__get_notification(notification)

        return UserNotification.query()\
            .filter(UserNotification.notification == notification)\
            .filter(UserNotification.user == user).scalar()

    def make_description(self, notification, show_age=True):
        """
        Creates a human readable description based on properties
        of notification object
        """

        _map = {
            notification.TYPE_CHANGESET_COMMENT: _('commented on commit'),
            notification.TYPE_MESSAGE: _('sent message'),
            notification.TYPE_MENTION: _('mentioned you'),
            notification.TYPE_REGISTRATION: _('registered in RhodeCode')
        }

        DATETIME_FORMAT = "%Y-%m-%d %H:%M:%S"

        tmpl = "%(user)s %(action)s %(when)s"
        if show_age:
            when = h.age(notification.created_on)
        else:
            DTF = lambda d: datetime.datetime.strftime(d, DATETIME_FORMAT)
            when = DTF(notification.created_on)
        data = dict(
            user=notification.created_by_user.username,
            action=_map[notification.type_], when=when,
        )
        return tmpl % data


class EmailNotificationModel(BaseModel):

    TYPE_CHANGESET_COMMENT = Notification.TYPE_CHANGESET_COMMENT
    TYPE_PASSWORD_RESET = 'passoword_link'
    TYPE_REGISTRATION = Notification.TYPE_REGISTRATION
    TYPE_DEFAULT = 'default'

    def __init__(self):
        self._template_root = rhodecode.CONFIG['pylons.paths']['templates'][0]
        self._tmpl_lookup = rhodecode.CONFIG['pylons.app_globals'].mako_lookup

        self.email_types = {
            self.TYPE_CHANGESET_COMMENT: 'email_templates/changeset_comment.html',
            self.TYPE_PASSWORD_RESET: 'email_templates/password_reset.html',
            self.TYPE_REGISTRATION: 'email_templates/registration.html',
            self.TYPE_DEFAULT: 'email_templates/default.html'
        }

    def get_email_tmpl(self, type_, **kwargs):
        """
        return generated template for email based on given type

        :param type_:
        """

        base = self.email_types.get(type_, self.email_types[self.TYPE_DEFAULT])
        email_template = self._tmpl_lookup.get_template(base)
        # translator inject
        _kwargs = {'_': _}
        _kwargs.update(kwargs)
        log.debug('rendering tmpl %s with kwargs %s' % (base, _kwargs))
        return email_template.render(**_kwargs)
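`NotificationModel.create()` above fans a single notification out to many users: one `Notification` row plus one unread `UserNotification` association per recipient, which the read-state queries then count and update. A self-contained sketch of that fan-out with plain objects in place of the SQLAlchemy models (all names here are illustrative stand-ins, not RhodeCode API):

```python
class FakeUser(object):
    """Stand-in for the User model: holds its notification associations."""
    def __init__(self, username):
        self.username = username
        self.notifications = []  # list of association dicts

class FakeNotification(object):
    """Stand-in for Notification.create(): attaches one unread
    association per recipient, like UserNotification rows."""
    def __init__(self, subject, body, recipients):
        self.subject = subject
        self.body = body
        for user in recipients:
            user.notifications.append({'notification': self, 'read': False})

def unread_count(user):
    # mirrors get_unread_cnt_for_user(): count associations not yet read
    return sum(1 for assoc in user.notifications if not assoc['read'])

def mark_all_read(user):
    # mirrors mark_all_read_for_user(): flip every association to read
    for assoc in user.notifications:
        assoc['read'] = True


alice, bob = FakeUser('alice'), FakeUser('bob')
FakeNotification('hello', 'first message', [alice, bob])
FakeNotification('hi again', 'second message', [alice])
print(unread_count(alice), unread_count(bob))  # 2 1
mark_all_read(alice)
print(unread_count(alice))  # 0
```

The association-object design means deleting a notification for one user (`NotificationModel.delete`) removes only that user's link, leaving the shared `Notification` row intact for other recipients.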
@@ -1,552 +1,561 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | """ |
|
2 | """ | |
rhodecode.model.user
~~~~~~~~~~~~~~~~~~~~

users model for RhodeCode

:created_on: Apr 9, 2010
:author: marcink
:copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com>
:license: GPLv3, see COPYING for more details.
"""
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.

import logging
import traceback

from pylons import url
from pylons.i18n.translation import _

from rhodecode.lib import safe_unicode
from rhodecode.lib.caching_query import FromCache

from rhodecode.model import BaseModel
from rhodecode.model.db import User, UserRepoToPerm, Repository, Permission, \
    UserToPerm, UsersGroupRepoToPerm, UsersGroupToPerm, UsersGroupMember, \
    Notification, RepoGroup, UserRepoGroupToPerm, UsersGroup
from rhodecode.lib.exceptions import DefaultUserException, \
    UserOwnsReposException

from sqlalchemy.exc import DatabaseError
from rhodecode.lib import generate_api_key
from sqlalchemy.orm import joinedload

log = logging.getLogger(__name__)


PERM_WEIGHTS = {
    'repository.none': 0,
    'repository.read': 1,
    'repository.write': 3,
    'repository.admin': 4,
    'group.none': 0,
    'group.read': 1,
    'group.write': 3,
    'group.admin': 4,
}
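
# NOTE: illustrative example, not part of the original module. The numeric
# weights above make permission strings comparable, e.g. when merging a
# user's direct permission with one inherited from a users group, the
# higher-weighted one can be kept:
#
#     p1, p2 = 'repository.read', 'repository.write'
#     strongest = max(p1, p2, key=PERM_WEIGHTS.get)  # 'repository.write'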


class UserModel(BaseModel):

    def __get_user(self, user):
        return self._get_instance(User, user, callback=User.get_by_username)

    def __get_perm(self, permission):
        return self._get_instance(Permission, permission,
                                  callback=Permission.get_by_key)

    def get(self, user_id, cache=False):
        user = self.sa.query(User)
        if cache:
            user = user.options(FromCache("sql_cache_short",
                                          "get_user_%s" % user_id))
        return user.get(user_id)

    def get_user(self, user):
        return self.__get_user(user)

    def get_by_username(self, username, cache=False, case_insensitive=False):

        if case_insensitive:
            user = self.sa.query(User).filter(User.username.ilike(username))
        else:
            user = self.sa.query(User)\
                .filter(User.username == username)
        if cache:
            user = user.options(FromCache("sql_cache_short",
                                          "get_user_%s" % username))
        return user.scalar()

    def get_by_api_key(self, api_key, cache=False):
        return User.get_by_api_key(api_key, cache)

    def create(self, form_data):
        try:
            new_user = User()
            for k, v in form_data.items():
                setattr(new_user, k, v)

            new_user.api_key = generate_api_key(form_data['username'])
            self.sa.add(new_user)
            return new_user
        except Exception:
            log.error(traceback.format_exc())
            raise

    def create_or_update(self, username, password, email, name, lastname,
                         active=True, admin=False, ldap_dn=None):
        """
        Creates a new user if not found, otherwise updates the existing one

        :param username:
        :param password:
        :param email:
        :param name:
        :param lastname:
        :param active:
        :param admin:
        :param ldap_dn:
        """

        from rhodecode.lib.auth import get_crypt_password

        log.debug('Checking for %s account in RhodeCode database' % username)
        user = User.get_by_username(username, case_insensitive=True)
        if user is None:
            log.debug('creating new user %s' % username)
            new_user = User()
        else:
            log.debug('updating user %s' % username)
            new_user = user

        try:
            new_user.username = username
            new_user.admin = admin
            new_user.password = get_crypt_password(password)
            new_user.api_key = generate_api_key(username)
            new_user.email = email
            new_user.active = active
            new_user.ldap_dn = safe_unicode(ldap_dn) if ldap_dn else None
            new_user.name = name
            new_user.lastname = lastname
            self.sa.add(new_user)
            return new_user
        except DatabaseError:
            log.error(traceback.format_exc())
            raise

    def create_for_container_auth(self, username, attrs):
        """
        Creates the given user if it's not already in the database

        :param username:
        :param attrs:
        """
        if self.get_by_username(username, case_insensitive=True) is None:

            # autogenerate email for container account without one
            generate_email = lambda usr: '%s@container_auth.account' % usr

            try:
                new_user = User()
                new_user.username = username
                new_user.password = None
                new_user.api_key = generate_api_key(username)
                new_user.email = attrs['email'] or generate_email(username)
                new_user.active = attrs.get('active', True)
                new_user.name = attrs['name']
                new_user.lastname = attrs['lastname']

                self.sa.add(new_user)
                return new_user
            except DatabaseError:
                log.error(traceback.format_exc())
                self.sa.rollback()
                raise
        log.debug('User %s already exists. Skipping creation of account'
                  ' for container auth.', username)
        return None

    def create_ldap(self, username, password, user_dn, attrs):
        """
        Checks if the user is in the database and, if not, creates it
        marked as an ldap user

        :param username:
        :param password:
        :param user_dn:
        :param attrs:
        """
        from rhodecode.lib.auth import get_crypt_password
        log.debug('Checking for such ldap account in RhodeCode database')
        if self.get_by_username(username, case_insensitive=True) is None:

            # autogenerate email for ldap account without one
            generate_email = lambda usr: '%s@ldap.account' % usr

            try:
                new_user = User()
                username = username.lower()
                # add ldap account always lowercase
                new_user.username = username
                new_user.password = get_crypt_password(password)
                new_user.api_key = generate_api_key(username)
                new_user.email = attrs['email'] or generate_email(username)
                new_user.active = attrs.get('active', True)
                new_user.ldap_dn = safe_unicode(user_dn)
                new_user.name = attrs['name']
                new_user.lastname = attrs['lastname']

                self.sa.add(new_user)
                return new_user
            except DatabaseError:
                log.error(traceback.format_exc())
                self.sa.rollback()
                raise
        log.debug('User %s already exists. Skipping creation of ldap account',
                  username)
        return None

    def create_registration(self, form_data):
        from rhodecode.model.notification import NotificationModel

        try:
            new_user = User()
            for k, v in form_data.items():
                if k != 'admin':
                    setattr(new_user, k, v)

            self.sa.add(new_user)
            self.sa.flush()

            # notification to admins
            subject = _('new user registration')
            body = ('New user registration\n'
                    '---------------------\n'
                    '- Username: %s\n'
                    '- Full Name: %s\n'
                    '- Email: %s\n')
            body = body % (new_user.username, new_user.full_name,
                           new_user.email)
            edit_url = url('edit_user', id=new_user.user_id, qualified=True)
            kw = {'registered_user_url': edit_url}
            NotificationModel().create(created_by=new_user, subject=subject,
                                       body=body, recipients=None,
                                       type_=Notification.TYPE_REGISTRATION,
                                       email_kwargs=kw)

        except Exception:
            log.error(traceback.format_exc())
            raise

    def update(self, user_id, form_data):
        try:
            user = self.get(user_id, cache=False)
            if user.username == 'default':
                raise DefaultUserException(
                    _("You can't edit this user since it's"
                      " crucial for the entire application"))

            for k, v in form_data.items():
                if k == 'new_password' and v != '':
                    user.password = v
                    user.api_key = generate_api_key(user.username)
                else:
                    setattr(user, k, v)

            self.sa.add(user)
        except Exception:
            log.error(traceback.format_exc())
            raise

    def update_my_account(self, user_id, form_data):
        try:
            user = self.get(user_id, cache=False)
            if user.username == 'default':
                raise DefaultUserException(
                    _("You can't edit this user since it's"
                      " crucial for the entire application"))
            for k, v in form_data.items():
                if k == 'new_password' and v != '':
                    user.password = v
                    user.api_key = generate_api_key(user.username)
                else:
                    if k not in ['admin', 'active']:
                        setattr(user, k, v)

            self.sa.add(user)
        except Exception:
            log.error(traceback.format_exc())
            raise

    def delete(self, user):
        user = self.__get_user(user)

        try:
            if user.username == 'default':
                raise DefaultUserException(
                    _("You can't remove this user since it's"
                      " crucial for the entire application"))
            if user.repositories:
                raise UserOwnsReposException(_('This user still owns %s '
                                               'repositories and cannot be '
                                               'removed. Switch owners or '
                                               'remove those repositories')
                                             % user.repositories)
            self.sa.delete(user)
        except Exception:
            log.error(traceback.format_exc())
            raise

    def reset_password_link(self, data):
        from rhodecode.lib.celerylib import tasks, run_task
        run_task(tasks.send_password_link, data['email'])

    def reset_password(self, data):
        from rhodecode.lib.celerylib import tasks, run_task
        run_task(tasks.reset_user_password, data['email'])

    def fill_data(self, auth_user, user_id=None, api_key=None):
        """
        Fetches auth_user by user_id, or by api_key if one is given.
        Fills auth_user attributes with those taken from the database.
        Additionally sets is_authenticated to False if an error occurs
        during the lookup.

        :param auth_user: instance of user to set attributes on
        :param user_id: user id to fetch by
        :param api_key: api key to fetch by
        """
        if user_id is None and api_key is None:
            raise Exception('You need to pass user_id or api_key')

        try:
            if api_key:
                dbuser = self.get_by_api_key(api_key)
            else:
                dbuser = self.get(user_id)

            if dbuser is not None and dbuser.active:
                log.debug('filling %s data' % dbuser)
                for k, v in dbuser.get_dict().items():
                    setattr(auth_user, k, v)
            else:
                return False

        except Exception:
            log.error(traceback.format_exc())
            auth_user.is_authenticated = False
            return False

        return True

    def fill_perms(self, user):
        """
        Fills the user permission attribute with permissions taken from the
        database. Works for permissions granted directly on repositories and
        for permissions granted through users groups.

        :param user: user instance to fill permissions for
        """
        RK = 'repositories'
        GK = 'repositories_groups'
        GLOBAL = 'global'
        user.permissions[RK] = {}
        user.permissions[GK] = {}
        user.permissions[GLOBAL] = set()

        #======================================================================
        # fetch default permissions
        #======================================================================
        default_user = User.get_by_username('default', cache=True)
        default_user_id = default_user.user_id

        default_repo_perms = Permission.get_default_perms(default_user_id)
        default_repo_groups_perms = Permission.get_default_group_perms(default_user_id)

        if user.is_admin:
            #==================================================================
            # admin users have all default rights for repositories
            # and groups set to admin
            #==================================================================
            user.permissions[GLOBAL].add('hg.admin')

            # repositories
            for perm in default_repo_perms:
                r_k = perm.UserRepoToPerm.repository.repo_name
                p = 'repository.admin'
                user.permissions[RK][r_k] = p

            # repositories groups
            for perm in default_repo_groups_perms:
                rg_k = perm.UserRepoGroupToPerm.group.group_name
                p = 'group.admin'
                user.permissions[GK][rg_k] = p

        else:
            #==================================================================
            # set default permissions first for repositories and groups
            #==================================================================
            uid = user.user_id

            # default global permissions
            default_global_perms = self.sa.query(UserToPerm)\
                .filter(UserToPerm.user_id == default_user_id)

            for perm in default_global_perms:
                user.permissions[GLOBAL].add(perm.permission.permission_name)

            # defaults for repositories
            for perm in default_repo_perms:
                r_k = perm.UserRepoToPerm.repository.repo_name
                if perm.Repository.private and not (perm.Repository.user_id == uid):
                    # disable defaults for private repos
                    p = 'repository.none'
                elif perm.Repository.user_id == uid:
                    # set admin if owner
                    p = 'repository.admin'
                else:
                    p = perm.Permission.permission_name

                user.permissions[RK][r_k] = p

            # defaults for repositories groups
            for perm in default_repo_groups_perms:
                rg_k = perm.UserRepoGroupToPerm.group.group_name
                p = perm.Permission.permission_name
                user.permissions[GK][rg_k] = p

            #==================================================================
            # overwrite defaults with user permissions if any
            #==================================================================

            # user global
            user_perms = self.sa.query(UserToPerm)\
                .options(joinedload(UserToPerm.permission))\
                .filter(UserToPerm.user_id == uid).all()

            for perm in user_perms:
                user.permissions[GLOBAL].add(perm.permission.permission_name)

            # user repositories
            user_repo_perms = \
                self.sa.query(UserRepoToPerm, Permission, Repository)\
                .join((Repository, UserRepoToPerm.repository_id == Repository.repo_id))\
                .join((Permission, UserRepoToPerm.permission_id == Permission.permission_id))\
                .filter(UserRepoToPerm.user_id == uid)\
                .all()

            for perm in user_repo_perms:
                # set admin if owner
                r_k = perm.UserRepoToPerm.repository.repo_name
                if perm.Repository.user_id == uid:
                    p = 'repository.admin'
                else:
                    p = perm.Permission.permission_name
                user.permissions[RK][r_k] = p

            #==================================================================
            # check if user is part of groups for this repository and fill in
            # (or replace with higher) permissions
            #==================================================================

            # users group global
            user_perms_from_users_groups = self.sa.query(UsersGroupToPerm)\
                .options(joinedload(UsersGroupToPerm.permission))\
                .join((UsersGroupMember, UsersGroupToPerm.users_group_id ==
                       UsersGroupMember.users_group_id))\
                .filter(UsersGroupMember.user_id == uid).all()

            for perm in user_perms_from_users_groups:
                user.permissions[GLOBAL].add(perm.permission.permission_name)

            # users group repositories
            user_repo_perms_from_users_groups = \
                self.sa.query(UsersGroupRepoToPerm, Permission, Repository,)\
                .join((Repository, UsersGroupRepoToPerm.repository_id == Repository.repo_id))\
                .join((Permission, UsersGroupRepoToPerm.permission_id == Permission.permission_id))\
                .join((UsersGroupMember, UsersGroupRepoToPerm.users_group_id == UsersGroupMember.users_group_id))\
|
481 | .join((UsersGroupMember, UsersGroupRepoToPerm.users_group_id == UsersGroupMember.users_group_id))\ | |
482 | .filter(UsersGroupMember.user_id == uid)\ |
|
482 | .filter(UsersGroupMember.user_id == uid)\ | |
483 | .all() |
|
483 | .all() | |
484 |
|
484 | |||
485 | for perm in user_repo_perms_from_users_groups: |
|
485 | for perm in user_repo_perms_from_users_groups: | |
486 | r_k = perm.UsersGroupRepoToPerm.repository.repo_name |
|
486 | r_k = perm.UsersGroupRepoToPerm.repository.repo_name | |
487 | p = perm.Permission.permission_name |
|
487 | p = perm.Permission.permission_name | |
488 | cur_perm = user.permissions[RK][r_k] |
|
488 | cur_perm = user.permissions[RK][r_k] | |
489 | # overwrite permission only if it's greater than permission |
|
489 | # overwrite permission only if it's greater than permission | |
490 | # given from other sources |
|
490 | # given from other sources | |
491 | if PERM_WEIGHTS[p] > PERM_WEIGHTS[cur_perm]: |
|
491 | if PERM_WEIGHTS[p] > PERM_WEIGHTS[cur_perm]: | |
492 | user.permissions[RK][r_k] = p |
|
492 | user.permissions[RK][r_k] = p | |
493 |
|
493 | |||
494 | #================================================================== |
|
494 | #================================================================== | |
495 | # get access for this user for repos group and override defaults |
|
495 | # get access for this user for repos group and override defaults | |
496 | #================================================================== |
|
496 | #================================================================== | |
497 |
|
497 | |||
498 | # user repositories groups |
|
498 | # user repositories groups | |
499 | user_repo_groups_perms = \ |
|
499 | user_repo_groups_perms = \ | |
500 | self.sa.query(UserRepoGroupToPerm, Permission, RepoGroup)\ |
|
500 | self.sa.query(UserRepoGroupToPerm, Permission, RepoGroup)\ | |
501 | .join((RepoGroup, UserRepoGroupToPerm.group_id == RepoGroup.group_id))\ |
|
501 | .join((RepoGroup, UserRepoGroupToPerm.group_id == RepoGroup.group_id))\ | |
502 | .join((Permission, UserRepoGroupToPerm.permission_id == Permission.permission_id))\ |
|
502 | .join((Permission, UserRepoGroupToPerm.permission_id == Permission.permission_id))\ | |
503 | .filter(UserRepoToPerm.user_id == uid)\ |
|
503 | .filter(UserRepoToPerm.user_id == uid)\ | |
504 | .all() |
|
504 | .all() | |
505 |
|
505 | |||
506 | for perm in user_repo_groups_perms: |
|
506 | for perm in user_repo_groups_perms: | |
507 | rg_k = perm.UserRepoGroupToPerm.group.group_name |
|
507 | rg_k = perm.UserRepoGroupToPerm.group.group_name | |
508 | p = perm.Permission.permission_name |
|
508 | p = perm.Permission.permission_name | |
509 | cur_perm = user.permissions[GK][rg_k] |
|
509 | cur_perm = user.permissions[GK][rg_k] | |
510 | if PERM_WEIGHTS[p] > PERM_WEIGHTS[cur_perm]: |
|
510 | if PERM_WEIGHTS[p] > PERM_WEIGHTS[cur_perm]: | |
511 | user.permissions[GK][rg_k] = p |
|
511 | user.permissions[GK][rg_k] = p | |
512 |
|
512 | |||
513 | return user |
|
513 | return user | |
514 |
|
514 | |||
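The merge above keeps whichever permission carries more weight, so a group-granted permission only replaces one from another source when it is strictly stronger. A minimal sketch of that rule (the `PERM_WEIGHTS` values below are illustrative stand-ins, not the actual RhodeCode table):

```python
# Hypothetical weights; RhodeCode keeps its own PERM_WEIGHTS mapping.
PERM_WEIGHTS = {
    'repository.none': 0,
    'repository.read': 1,
    'repository.write': 3,
    'repository.admin': 4,
}

def merge_perm(current, candidate):
    """Return the higher-weighted of two repository permissions,
    mirroring the `if PERM_WEIGHTS[p] > PERM_WEIGHTS[cur_perm]` guard."""
    if PERM_WEIGHTS[candidate] > PERM_WEIGHTS[current]:
        return candidate
    return current
```

With this rule the order in which sources are applied (user, users group, defaults) cannot downgrade an already-granted stronger permission.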
515 | def has_perm(self, user, perm): |
|
515 | def has_perm(self, user, perm): | |
516 | if not isinstance(perm, Permission): |
|
516 | if not isinstance(perm, Permission): | |
517 | raise Exception('perm needs to be an instance of Permission class ' |
|
517 | raise Exception('perm needs to be an instance of Permission class ' | |
518 | 'got %s instead' % type(perm)) |
|
518 | 'got %s instead' % type(perm)) | |
519 |
|
519 | |||
520 | user = self.__get_user(user) |
|
520 | user = self.__get_user(user) | |
521 |
|
521 | |||
522 | return UserToPerm.query().filter(UserToPerm.user == user)\ |
|
522 | return UserToPerm.query().filter(UserToPerm.user == user)\ | |
523 | .filter(UserToPerm.permission == perm).scalar() is not None |
|
523 | .filter(UserToPerm.permission == perm).scalar() is not None | |
524 |
|
524 | |||
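`has_perm` relies on `.scalar()` returning either the matching row or `None`, which `is not None` then coerces to a boolean. The same "first match or None" shape in plain Python (a sketch, not the SQLAlchemy call):

```python
def has_global_perm(granted, perm):
    """Mimic the .scalar() is not None idiom: take the first matching
    entry (or None if there is none) and turn that into a boolean."""
    row = next((p for p in granted if p == perm), None)
    return row is not None
```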
525 | def grant_perm(self, user, perm): |
|
525 | def grant_perm(self, user, perm): | |
526 | """ |
|
526 | """ | |
527 | Grant user global permissions |
|
527 | Grant user global permissions | |
528 |
|
528 | |||
529 | :param user: |
|
529 | :param user: | |
530 | :param perm: |
|
530 | :param perm: | |
531 | """ |
|
531 | """ | |
532 | user = self.__get_user(user) |
|
532 | user = self.__get_user(user) | |
533 | perm = self.__get_perm(perm) |
|
533 | perm = self.__get_perm(perm) | |
|
534 | # if this permission is already granted skip it | |||
|
535 | _perm = UserToPerm.query()\ | |||
|
536 | .filter(UserToPerm.user == user)\ | |||
|
537 | .filter(UserToPerm.permission == perm)\ | |||
|
538 | .scalar() | |||
|
539 | if _perm: | |||
|
540 | return | |||
534 | new = UserToPerm() |
|
541 | new = UserToPerm() | |
535 | new.user = user |
|
542 | new.user = user | |
536 | new.permission = perm |
|
543 | new.permission = perm | |
537 | self.sa.add(new) |
|
544 | self.sa.add(new) | |
538 |
|
545 | |||
539 | def revoke_perm(self, user, perm): |
|
546 | def revoke_perm(self, user, perm): | |
540 | """ |
|
547 | """ | |
541 | Revoke users global permissions |
|
548 | Revoke users global permissions | |
542 |
|
549 | |||
543 | :param user: |
|
550 | :param user: | |
544 | :param perm: |
|
551 | :param perm: | |
545 | """ |
|
552 | """ | |
546 | user = self.__get_user(user) |
|
553 | user = self.__get_user(user) | |
547 | perm = self.__get_perm(perm) |
|
554 | perm = self.__get_perm(perm) | |
548 |
|
555 | |||
549 | obj = UserToPerm.query().filter(UserToPerm.user == user)\ |
 |
556 | obj = UserToPerm.query()\ | 
550 | .filter(UserToPerm.permission == perm).scalar() |
 |
557 | .filter(UserToPerm.user == user)\ | 
|
558 | .filter(UserToPerm.permission == perm)\ | |||
|
559 | .scalar() | |||
551 | if obj: |
|
560 | if obj: | |
552 | self.sa.delete(obj) |
|
561 | self.sa.delete(obj) |
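The net effect of this hunk is that `grant_perm`/`revoke_perm` become an idempotent pair: granting is a no-op when the row already exists, and revoking is a no-op when it does not. A dictionary-backed sketch of that contract (plain Python, no SQLAlchemy; `PermStore` is a toy stand-in for the `UserToPerm` table):

```python
class PermStore:
    """Toy stand-in for UserToPerm: (user, perm) rows held in a set."""

    def __init__(self):
        self.rows = set()

    def grant(self, user, perm):
        # if this permission is already granted skip it -- the new guard
        if (user, perm) in self.rows:
            return
        self.rows.add((user, perm))

    def revoke(self, user, perm):
        # delete only if the row exists, as revoke_perm() does
        if (user, perm) in self.rows:
            self.rows.discard((user, perm))
```

Calling `grant` or `revoke` twice in a row is now harmless, which is exactly what the added existence check buys.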
@@ -1,188 +1,196 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | """ |
|
2 | """ | |
3 | rhodecode.model.users_group |
|
3 | rhodecode.model.users_group | |
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
5 |
|
5 | |||
6 | users group model for RhodeCode |
|
6 | users group model for RhodeCode | |
7 |
|
7 | |||
8 | :created_on: Oct 1, 2011 |
|
8 | :created_on: Oct 1, 2011 | |
9 | :author: nvinot |
|
9 | :author: nvinot | |
10 | :copyright: (C) 2011-2011 Nicolas Vinot <aeris@imirhil.fr> |
|
10 | :copyright: (C) 2011-2011 Nicolas Vinot <aeris@imirhil.fr> | |
11 | :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com> |
|
11 | :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com> | |
12 | :license: GPLv3, see COPYING for more details. |
|
12 | :license: GPLv3, see COPYING for more details. | |
13 | """ |
|
13 | """ | |
14 | # This program is free software: you can redistribute it and/or modify |
|
14 | # This program is free software: you can redistribute it and/or modify | |
15 | # it under the terms of the GNU General Public License as published by |
|
15 | # it under the terms of the GNU General Public License as published by | |
16 | # the Free Software Foundation, either version 3 of the License, or |
|
16 | # the Free Software Foundation, either version 3 of the License, or | |
17 | # (at your option) any later version. |
|
17 | # (at your option) any later version. | |
18 | # |
|
18 | # | |
19 | # This program is distributed in the hope that it will be useful, |
|
19 | # This program is distributed in the hope that it will be useful, | |
20 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
20 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
21 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
21 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
22 | # GNU General Public License for more details. |
|
22 | # GNU General Public License for more details. | |
23 | # |
|
23 | # | |
24 | # You should have received a copy of the GNU General Public License |
|
24 | # You should have received a copy of the GNU General Public License | |
25 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
25 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
26 |
|
26 | |||
27 | import logging |
|
27 | import logging | |
28 | import traceback |
|
28 | import traceback | |
29 |
|
29 | |||
30 | from rhodecode.model import BaseModel |
|
30 | from rhodecode.model import BaseModel | |
31 | from rhodecode.model.db import UsersGroupMember, UsersGroup,\ |
|
31 | from rhodecode.model.db import UsersGroupMember, UsersGroup,\ | |
32 | UsersGroupRepoToPerm, Permission, UsersGroupToPerm, User |
|
32 | UsersGroupRepoToPerm, Permission, UsersGroupToPerm, User | |
33 | from rhodecode.lib.exceptions import UsersGroupsAssignedException |
|
33 | from rhodecode.lib.exceptions import UsersGroupsAssignedException | |
34 |
|
34 | |||
35 | log = logging.getLogger(__name__) |
|
35 | log = logging.getLogger(__name__) | |
36 |
|
36 | |||
37 |
|
37 | |||
38 | class UsersGroupModel(BaseModel): |
|
38 | class UsersGroupModel(BaseModel): | |
39 |
|
39 | |||
40 | def __get_user(self, user): |
|
40 | def __get_user(self, user): | |
41 | return self._get_instance(User, user, callback=User.get_by_username) |
|
41 | return self._get_instance(User, user, callback=User.get_by_username) | |
42 |
|
42 | |||
43 | def __get_users_group(self, users_group): |
|
43 | def __get_users_group(self, users_group): | |
44 | return self._get_instance(UsersGroup, users_group, |
|
44 | return self._get_instance(UsersGroup, users_group, | |
45 | callback=UsersGroup.get_by_group_name) |
|
45 | callback=UsersGroup.get_by_group_name) | |
46 |
|
46 | |||
47 | def __get_perm(self, permission): |
|
47 | def __get_perm(self, permission): | |
48 | return self._get_instance(Permission, permission, |
|
48 | return self._get_instance(Permission, permission, | |
49 | callback=Permission.get_by_key) |
|
49 | callback=Permission.get_by_key) | |
50 |
|
50 | |||
51 | def get(self, users_group_id, cache=False): |
|
51 | def get(self, users_group_id, cache=False): | |
52 | return UsersGroup.get(users_group_id) |
|
52 | return UsersGroup.get(users_group_id) | |
53 |
|
53 | |||
54 | def get_by_name(self, name, cache=False, case_insensitive=False): |
|
54 | def get_by_name(self, name, cache=False, case_insensitive=False): | |
55 | return UsersGroup.get_by_group_name(name, cache, case_insensitive) |
|
55 | return UsersGroup.get_by_group_name(name, cache, case_insensitive) | |
56 |
|
56 | |||
57 | def create(self, name, active=True): |
|
57 | def create(self, name, active=True): | |
58 | try: |
|
58 | try: | |
59 | new = UsersGroup() |
|
59 | new = UsersGroup() | |
60 | new.users_group_name = name |
|
60 | new.users_group_name = name | |
61 | new.users_group_active = active |
|
61 | new.users_group_active = active | |
62 | self.sa.add(new) |
|
62 | self.sa.add(new) | |
63 | return new |
|
63 | return new | |
64 | except: |
|
64 | except: | |
65 | log.error(traceback.format_exc()) |
|
65 | log.error(traceback.format_exc()) | |
66 | raise |
|
66 | raise | |
67 |
|
67 | |||
68 | def update(self, users_group, form_data): |
|
68 | def update(self, users_group, form_data): | |
69 |
|
69 | |||
70 | try: |
|
70 | try: | |
71 | users_group = self.__get_users_group(users_group) |
|
71 | users_group = self.__get_users_group(users_group) | |
72 |
|
72 | |||
73 | for k, v in form_data.items(): |
|
73 | for k, v in form_data.items(): | |
74 | if k == 'users_group_members': |
|
74 | if k == 'users_group_members': | |
75 | users_group.members = [] |
|
75 | users_group.members = [] | |
76 | self.sa.flush() |
|
76 | self.sa.flush() | |
77 | members_list = [] |
|
77 | members_list = [] | |
78 | if v: |
|
78 | if v: | |
79 | v = [v] if isinstance(v, basestring) else v |
|
79 | v = [v] if isinstance(v, basestring) else v | |
80 | for u_id in set(v): |
|
80 | for u_id in set(v): | |
81 | member = UsersGroupMember(users_group.users_group_id, u_id) |
|
81 | member = UsersGroupMember(users_group.users_group_id, u_id) | |
82 | members_list.append(member) |
|
82 | members_list.append(member) | |
83 | setattr(users_group, 'members', members_list) |
|
83 | setattr(users_group, 'members', members_list) | |
84 | setattr(users_group, k, v) |
|
84 | setattr(users_group, k, v) | |
85 |
|
85 | |||
86 | self.sa.add(users_group) |
|
86 | self.sa.add(users_group) | |
87 | except: |
|
87 | except: | |
88 | log.error(traceback.format_exc()) |
|
88 | log.error(traceback.format_exc()) | |
89 | raise |
|
89 | raise | |
90 |
|
90 | |||
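The `users_group_members` branch above has to cope with a form field that may arrive as a single id or a list of ids, hence the `v = [v] if isinstance(v, basestring) else v` guard (`basestring` is Python 2; `str` is its Python 3 counterpart) followed by `set()` de-duplication. A small sketch of that normalization:

```python
def as_member_list(value):
    """Normalize a form value that may be None, a single id string, or a
    list of id strings, de-duplicating as update() does with set(v)."""
    if not value:
        return []
    if isinstance(value, str):  # basestring under Python 2
        value = [value]
    return sorted(set(value))
```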
91 | def delete(self, users_group, force=False): |
|
91 | def delete(self, users_group, force=False): | |
92 | """ |
|
92 | """ | |
93 | Deletes repos group, unless force flag is used |
|
93 | Deletes repos group, unless force flag is used | |
94 | raises exception if there are members in that group, else deletes |
|
94 | raises exception if there are members in that group, else deletes | |
95 | group and users |
|
95 | group and users | |
96 |
|
96 | |||
97 | :param users_group: |
|
97 | :param users_group: | |
98 | :param force: |
|
98 | :param force: | |
99 | """ |
|
99 | """ | |
100 | try: |
|
100 | try: | |
101 | users_group = self.__get_users_group(users_group) |
|
101 | users_group = self.__get_users_group(users_group) | |
102 |
|
102 | |||
103 | # check if this group is not assigned to repo |
|
103 | # check if this group is not assigned to repo | |
104 | assigned_groups = UsersGroupRepoToPerm.query()\ |
|
104 | assigned_groups = UsersGroupRepoToPerm.query()\ | |
105 | .filter(UsersGroupRepoToPerm.users_group == users_group).all() |
|
105 | .filter(UsersGroupRepoToPerm.users_group == users_group).all() | |
106 |
|
106 | |||
107 | if assigned_groups and force is False: |
|
107 | if assigned_groups and force is False: | |
108 | raise UsersGroupsAssignedException('RepoGroup assigned to %s' % |
|
108 | raise UsersGroupsAssignedException('RepoGroup assigned to %s' % | |
109 | assigned_groups) |
|
109 | assigned_groups) | |
110 |
|
110 | |||
111 | self.sa.delete(users_group) |
|
111 | self.sa.delete(users_group) | |
112 | except: |
|
112 | except: | |
113 | log.error(traceback.format_exc()) |
|
113 | log.error(traceback.format_exc()) | |
114 | raise |
|
114 | raise | |
115 |
|
115 | |||
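`delete` refuses to remove a group that is still assigned to repositories unless `force` is set. The guard pattern, as a toy model (`assignments` stands in for the `UsersGroupRepoToPerm` rows; the exception name is illustrative):

```python
class GroupInUse(Exception):
    """Raised when a group still has repo assignments and force is False."""

def delete_group(groups, assignments, name, force=False):
    """Delete `name` from `groups` unless it still has assignments,
    mirroring the assigned_groups / force check in delete()."""
    assigned = assignments.get(name, [])
    if assigned and not force:
        raise GroupInUse('group %r still assigned to %s' % (name, assigned))
    groups.discard(name)
```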
116 | def add_user_to_group(self, users_group, user): |
|
116 | def add_user_to_group(self, users_group, user): | |
117 | users_group = self.__get_users_group(users_group) |
|
117 | users_group = self.__get_users_group(users_group) | |
118 | user = self.__get_user(user) |
|
118 | user = self.__get_user(user) | |
119 |
|
119 | |||
120 | for m in users_group.members: |
|
120 | for m in users_group.members: | |
121 | u = m.user |
|
121 | u = m.user | |
122 | if u.user_id == user.user_id: |
|
122 | if u.user_id == user.user_id: | |
123 | return True |
|
123 | return True | |
124 |
|
124 | |||
125 | try: |
|
125 | try: | |
126 | users_group_member = UsersGroupMember() |
|
126 | users_group_member = UsersGroupMember() | |
127 | users_group_member.user = user |
|
127 | users_group_member.user = user | |
128 | users_group_member.users_group = users_group |
|
128 | users_group_member.users_group = users_group | |
129 |
|
129 | |||
130 | users_group.members.append(users_group_member) |
|
130 | users_group.members.append(users_group_member) | |
131 | user.group_member.append(users_group_member) |
|
131 | user.group_member.append(users_group_member) | |
132 |
|
132 | |||
133 | self.sa.add(users_group_member) |
|
133 | self.sa.add(users_group_member) | |
134 | return users_group_member |
|
134 | return users_group_member | |
135 | except: |
|
135 | except: | |
136 | log.error(traceback.format_exc()) |
|
136 | log.error(traceback.format_exc()) | |
137 | raise |
|
137 | raise | |
138 |
|
138 | |||
139 | def remove_user_from_group(self, users_group, user): |
|
139 | def remove_user_from_group(self, users_group, user): | |
140 | users_group = self.__get_users_group(users_group) |
|
140 | users_group = self.__get_users_group(users_group) | |
141 | user = self.__get_user(user) |
|
141 | user = self.__get_user(user) | |
142 |
|
142 | |||
143 | users_group_member = None |
|
143 | users_group_member = None | |
144 | for m in users_group.members: |
|
144 | for m in users_group.members: | |
145 | if m.user.user_id == user.user_id: |
|
145 | if m.user.user_id == user.user_id: | |
146 | # Found this user's membership row |
|
146 | # Found this user's membership row | |
147 | users_group_member = m |
|
147 | users_group_member = m | |
148 | break |
|
148 | break | |
149 |
|
149 | |||
150 | if users_group_member: |
|
150 | if users_group_member: | |
151 | try: |
|
151 | try: | |
152 | self.sa.delete(users_group_member) |
|
152 | self.sa.delete(users_group_member) | |
153 | return True |
|
153 | return True | |
154 | except: |
|
154 | except: | |
155 | log.error(traceback.format_exc()) |
|
155 | log.error(traceback.format_exc()) | |
156 | raise |
|
156 | raise | |
157 | else: |
|
157 | else: | |
158 | # User isn't in that group |
|
158 | # User isn't in that group | |
159 | return False |
|
159 | return False | |
160 |
|
160 | |||
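Both `add_user_to_group` (to avoid duplicate memberships) and `remove_user_from_group` (to locate the row to delete) run the same linear scan over `users_group.members`. The shared lookup, sketched with plain dicts standing in for `UsersGroupMember` rows:

```python
def find_membership(members, user_id):
    """Return the member row whose user_id matches, or None --
    the loop both add_user_to_group() and remove_user_from_group() use."""
    for member in members:
        if member['user_id'] == user_id:
            return member
    return None
```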
161 | def has_perm(self, users_group, perm): |
|
161 | def has_perm(self, users_group, perm): | |
162 | users_group = self.__get_users_group(users_group) |
|
162 | users_group = self.__get_users_group(users_group) | |
163 | perm = self.__get_perm(perm) |
|
163 | perm = self.__get_perm(perm) | |
164 |
|
164 | |||
165 | return UsersGroupToPerm.query()\ |
|
165 | return UsersGroupToPerm.query()\ | |
166 | .filter(UsersGroupToPerm.users_group == users_group)\ |
|
166 | .filter(UsersGroupToPerm.users_group == users_group)\ | |
167 | .filter(UsersGroupToPerm.permission == perm).scalar() is not None |
|
167 | .filter(UsersGroupToPerm.permission == perm).scalar() is not None | |
168 |
|
168 | |||
169 | def grant_perm(self, users_group, perm): |
|
169 | def grant_perm(self, users_group, perm): | |
170 | if not isinstance(perm, Permission): |
|
170 | if not isinstance(perm, Permission): | |
171 | raise Exception('perm needs to be an instance of Permission class') |
|
171 | raise Exception('perm needs to be an instance of Permission class') | |
172 |
|
172 | |||
173 | users_group = self.__get_users_group(users_group) |
|
173 | users_group = self.__get_users_group(users_group) | |
174 |
|
174 | |||
|
175 | # if this permission is already granted skip it | |||
|
176 | _perm = UsersGroupToPerm.query()\ | |||
|
177 | .filter(UsersGroupToPerm.users_group == users_group)\ | |||
|
178 | .filter(UsersGroupToPerm.permission == perm)\ | |||
|
179 | .scalar() | |||
|
180 | if _perm: | |||
|
181 | return | |||
|
182 | ||||
175 | new = UsersGroupToPerm() |
|
183 | new = UsersGroupToPerm() | |
176 | new.users_group = users_group |
|
184 | new.users_group = users_group | |
177 | new.permission = perm |
|
185 | new.permission = perm | |
178 | self.sa.add(new) |
|
186 | self.sa.add(new) | |
179 |
|
187 | |||
180 | def revoke_perm(self, users_group, perm): |
|
188 | def revoke_perm(self, users_group, perm): | |
181 | users_group = self.__get_users_group(users_group) |
|
189 | users_group = self.__get_users_group(users_group) | |
182 | perm = self.__get_perm(perm) |
|
190 | perm = self.__get_perm(perm) | |
183 |
|
191 | |||
184 | obj = UsersGroupToPerm.query()\ |
|
192 | obj = UsersGroupToPerm.query()\ | |
185 | .filter(UsersGroupToPerm.users_group == users_group)\ |
|
193 | .filter(UsersGroupToPerm.users_group == users_group)\ | |
186 | .filter(UsersGroupToPerm.permission == perm).scalar() |
|
194 | .filter(UsersGroupToPerm.permission == perm).scalar() | |
187 | if obj: |
|
195 | if obj: | |
188 | self.sa.delete(obj) |
|
196 | self.sa.delete(obj) |
@@ -1,722 +1,726 b'' | |||||
1 | /** |
|
1 | /** | |
2 | RhodeCode JS Files |
|
2 | RhodeCode JS Files | |
3 | **/ |
|
3 | **/ | |
4 |
|
4 | |||
5 | if (typeof console == "undefined" || typeof console.log == "undefined"){ |
|
5 | if (typeof console == "undefined" || typeof console.log == "undefined"){ | |
6 | console = { log: function() {} } |
|
6 | console = { log: function() {} } | |
7 | } |
|
7 | } | |
8 |
|
8 | |||
9 |
|
9 | |||
10 | var str_repeat = function(i, m) { |
|
10 | var str_repeat = function(i, m) { | |
11 | for (var o = []; m > 0; o[--m] = i); |
|
11 | for (var o = []; m > 0; o[--m] = i); | |
12 | return o.join(''); |
|
12 | return o.join(''); | |
13 | }; |
|
13 | }; | |
14 |
|
14 | |||
15 | /** |
|
15 | /** | |
16 | * INJECT .format function into String |
|
16 | * INJECT .format function into String | |
17 | * Usage: "My name is {0} {1}".format("Johny","Bravo") |
|
17 | * Usage: "My name is {0} {1}".format("Johny","Bravo") | |
18 | * Return "My name is Johny Bravo" |
|
18 | * Return "My name is Johny Bravo" | |
19 | * Inspired by https://gist.github.com/1049426 |
|
19 | * Inspired by https://gist.github.com/1049426 | |
20 | */ |
|
20 | */ | |
21 | String.prototype.format = function() { |
|
21 | String.prototype.format = function() { | |
22 |
|
22 | |||
23 | function format() { |
|
23 | function format() { | |
24 | var str = this; |
|
24 | var str = this; | |
25 | var len = arguments.length+1; |
|
25 | var len = arguments.length+1; | |
26 | var safe = undefined; |
|
26 | var safe = undefined; | |
27 | var arg = undefined; |
|
27 | var arg = undefined; | |
28 |
|
28 | |||
29 | // For each {0} {1} {n...} replace with the argument in that position. If |
|
29 | // For each {0} {1} {n...} replace with the argument in that position. If | |
30 | // the argument is an object or an array it will be stringified to JSON. |
|
30 | // the argument is an object or an array it will be stringified to JSON. | |
31 | for (var i=0; i < len; arg = arguments[i++]) { |
|
31 | for (var i=0; i < len; arg = arguments[i++]) { | |
32 | safe = typeof arg === 'object' ? JSON.stringify(arg) : arg; |
|
32 | safe = typeof arg === 'object' ? JSON.stringify(arg) : arg; | |
33 | str = str.replace(RegExp('\\{'+(i-1)+'\\}', 'g'), safe); |
|
33 | str = str.replace(RegExp('\\{'+(i-1)+'\\}', 'g'), safe); | |
34 | } |
|
34 | } | |
35 | return str; |
|
35 | return str; | |
36 | } |
|
36 | } | |
37 |
|
37 | |||
38 | // Save a reference of what may already exist under the property native. |
|
38 | // Save a reference of what may already exist under the property native. | |
39 | // Allows for doing something like: if("".format.native) { /* use native */ } |
|
39 | // Allows for doing something like: if("".format.native) { /* use native */ } | |
40 | format.native = String.prototype.format; |
|
40 | format.native = String.prototype.format; | |
41 |
|
41 | |||
42 | // Replace the prototype property |
|
42 | // Replace the prototype property | |
43 | return format; |
|
43 | return format; | |
44 |
|
44 | |||
45 | }(); |
|
45 | }(); | |
46 |
|
46 | |||
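The shim above replaces each `{n}` placeholder with the argument at that position, JSON-stringifying objects and arrays first. A rough Python analogue of the same loop (a sketch of the behavior, not the YUI/JS code itself):

```python
import json

def js_style_format(template, *args):
    """Replace {0}, {1}, ... with positional arguments; dicts and lists
    are JSON-stringified first, as the JS format() loop does."""
    out = template
    for i, arg in enumerate(args):
        safe = json.dumps(arg) if isinstance(arg, (dict, list)) else str(arg)
        out = out.replace('{%d}' % i, safe)
    return out
```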
47 |
|
47 | |||
48 | /** |
|
48 | /** | |
49 | * SmartColorGenerator |
|
49 | * SmartColorGenerator | |
50 | * |
|
50 | * | |
51 | *usage:: |
|
51 | *usage:: | |
52 | * var CG = new ColorGenerator(); |
|
52 | * var CG = new ColorGenerator(); | |
53 | * var col = CG.getColor(key); //returns array of RGB |
|
53 | * var col = CG.getColor(key); //returns array of RGB | |
54 | * 'rgb({0})'.format(col.join(',')) |
 |
54 | * 'rgb({0})'.format(col.join(',')) | |
55 | * |
|
55 | * | |
56 | * @returns {ColorGenerator} |
|
56 | * @returns {ColorGenerator} | |
57 | */ |
|
57 | */ | |
58 | var ColorGenerator = function(){ |
|
58 | var ColorGenerator = function(){ | |
59 | this.GOLDEN_RATIO = 0.618033988749895; |
|
59 | this.GOLDEN_RATIO = 0.618033988749895; | |
60 | this.CURRENT_RATIO = 0.22717784590367374 // this can be random |
|
60 | this.CURRENT_RATIO = 0.22717784590367374 // this can be random | |
61 | this.HSV_1 = 0.75;//saturation |
|
61 | this.HSV_1 = 0.75;//saturation | |
62 | this.HSV_2 = 0.95; |
|
62 | this.HSV_2 = 0.95; | |
63 | this.color; |
|
63 | this.color; | |
64 | this.cacheColorMap = {}; |
|
64 | this.cacheColorMap = {}; | |
65 | }; |
|
65 | }; | |
66 |
|
66 | |||
67 | ColorGenerator.prototype = { |
|
67 | ColorGenerator.prototype = { | |
68 | getColor:function(key){ |
|
68 | getColor:function(key){ | |
69 | if(this.cacheColorMap[key] !== undefined){ |
|
69 | if(this.cacheColorMap[key] !== undefined){ | |
70 | return this.cacheColorMap[key]; |
|
70 | return this.cacheColorMap[key]; | |
71 | } |
|
71 | } | |
72 | else{ |
|
72 | else{ | |
73 | this.cacheColorMap[key] = this.generateColor(); |
|
73 | this.cacheColorMap[key] = this.generateColor(); | |
74 | return this.cacheColorMap[key]; |
|
74 | return this.cacheColorMap[key]; | |
75 | } |
|
75 | } | |
76 | }, |
|
76 | }, | |
77 | _hsvToRgb:function(h,s,v){ |
|
77 | _hsvToRgb:function(h,s,v){ | |
78 | if (s == 0.0) |
|
78 | if (s == 0.0) | |
79 | return [v, v, v]; |
|
79 | return [v, v, v]; | |
80 | i = parseInt(h * 6.0) |
|
80 | i = parseInt(h * 6.0) | |
81 | f = (h * 6.0) - i |
|
81 | f = (h * 6.0) - i | |
82 | p = v * (1.0 - s) |
|
82 | p = v * (1.0 - s) | |
83 | q = v * (1.0 - s * f) |
|
83 | q = v * (1.0 - s * f) | |
84 | t = v * (1.0 - s * (1.0 - f)) |
|
84 | t = v * (1.0 - s * (1.0 - f)) | |
85 | i = i % 6 |
|
85 | i = i % 6 | |
86 | if (i == 0) |
|
86 | if (i == 0) | |
87 | return [v, t, p] |
|
87 | return [v, t, p] | |
88 | if (i == 1) |
|
88 | if (i == 1) | |
89 | return [q, v, p] |
|
89 | return [q, v, p] | |
90 | if (i == 2) |
|
90 | if (i == 2) | |
91 | return [p, v, t] |
|
91 | return [p, v, t] | |
92 | if (i == 3) |
|
92 | if (i == 3) | |
93 | return [p, q, v] |
|
93 | return [p, q, v] | |
94 | if (i == 4) |
|
94 | if (i == 4) | |
95 | return [t, p, v] |
|
95 | return [t, p, v] | |
96 | if (i == 5) |
|
96 | if (i == 5) | |
97 | return [v, p, q] |
|
97 | return [v, p, q] | |
98 | }, |
|
98 | }, | |
99 | generateColor:function(){ |
|
99 | generateColor:function(){ | |
100 | this.CURRENT_RATIO = this.CURRENT_RATIO+this.GOLDEN_RATIO; |
|
100 | this.CURRENT_RATIO = this.CURRENT_RATIO+this.GOLDEN_RATIO; | |
101 | this.CURRENT_RATIO = this.CURRENT_RATIO %= 1; |
|
101 | this.CURRENT_RATIO = this.CURRENT_RATIO %= 1; | |
102 | HSV_tuple = [this.CURRENT_RATIO, this.HSV_1, this.HSV_2] |
|
102 | HSV_tuple = [this.CURRENT_RATIO, this.HSV_1, this.HSV_2] | |
103 | RGB_tuple = this._hsvToRgb(HSV_tuple[0],HSV_tuple[1],HSV_tuple[2]); |
|
103 | RGB_tuple = this._hsvToRgb(HSV_tuple[0],HSV_tuple[1],HSV_tuple[2]); | |
104 | function toRgb(v){ |
|
104 | function toRgb(v){ | |
105 | return ""+parseInt(v*256) |
|
105 | return ""+parseInt(v*256) | |
106 | } |
|
106 | } | |
107 | return [toRgb(RGB_tuple[0]),toRgb(RGB_tuple[1]),toRgb(RGB_tuple[2])]; |
|
107 | return [toRgb(RGB_tuple[0]),toRgb(RGB_tuple[1]),toRgb(RGB_tuple[2])]; | |
108 |
|
108 | |||
109 | } |
|
109 | } | |
110 | } |
|
110 | } | |
111 |
|
111 | |||
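`ColorGenerator` steps the hue by the golden ratio (mod 1) so that colors handed out for consecutive keys stay well separated, then converts HSV to RGB. The same idea in Python, using the standard-library `colorsys` instead of the hand-rolled `_hsvToRgb` (constants mirror the JS above):

```python
import colorsys

def color_stream(start=0.22717784590367374, s=0.75, v=0.95):
    """Yield well-separated RGB triples by advancing the hue by the
    golden ratio modulo 1, as ColorGenerator.generateColor() does."""
    golden_ratio = 0.618033988749895
    ratio = start
    while True:
        ratio = (ratio + golden_ratio) % 1.0
        rgb = colorsys.hsv_to_rgb(ratio, s, v)
        # the JS uses parseInt(v * 256); mirror that scaling here
        yield tuple(int(c * 256) for c in rgb)
```

Because the golden ratio is irrational, successive hues never repeat and neighbouring ones are far apart on the color wheel, which is why this trick is popular for assigning colors to an unbounded set of keys.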
112 |
|
112 | |||
113 |
|
113 | |||
114 |
|
114 | |||
115 |
|
115 | |||
116 | /** |
|
116 | /** | |
117 | * GLOBAL YUI Shortcuts |
|
117 | * GLOBAL YUI Shortcuts | |
118 | */ |
|
118 | */ | |
119 | var YUC = YAHOO.util.Connect; |
|
119 | var YUC = YAHOO.util.Connect; | |
120 | var YUD = YAHOO.util.Dom; |
|
120 | var YUD = YAHOO.util.Dom; | |
121 | var YUE = YAHOO.util.Event; |
|
121 | var YUE = YAHOO.util.Event; | |
122 | var YUQ = YAHOO.util.Selector.query; |
|
122 | var YUQ = YAHOO.util.Selector.query; | |
123 |
|
123 | |||
124 | // defines if push state is enabled for this browser ? |
|
124 | // defines if push state is enabled for this browser ? | |
125 | var push_state_enabled = Boolean( |
|
125 | var push_state_enabled = Boolean( | |
126 | window.history && window.history.pushState && window.history.replaceState |
|
126 | window.history && window.history.pushState && window.history.replaceState | |
127 | && !( /* disable for versions of iOS before version 4.3 (8F190) */ |
|
127 | && !( /* disable for versions of iOS before version 4.3 (8F190) */ | |
128 | (/ Mobile\/([1-7][a-z]|(8([abcde]|f(1[0-8]))))/i).test(navigator.userAgent) |
|
128 | (/ Mobile\/([1-7][a-z]|(8([abcde]|f(1[0-8]))))/i).test(navigator.userAgent) | |
129 | /* disable for the mercury iOS browser, or at least older versions of the webkit engine */ |
|
129 | /* disable for the mercury iOS browser, or at least older versions of the webkit engine */ | |
130 | || (/AppleWebKit\/5([0-2]|3[0-2])/i).test(navigator.userAgent) |
|
130 | || (/AppleWebKit\/5([0-2]|3[0-2])/i).test(navigator.userAgent) | |
131 | ) |
|
131 | ) | |
132 | ); |
|
132 | ); | |
133 |
|
133 | |||
134 | var _run_callbacks = function(callbacks){ |
|
134 | var _run_callbacks = function(callbacks){ | |
135 | if (callbacks !== undefined){ |
|
135 | if (callbacks !== undefined){ | |
136 | var _l = callbacks.length; |
|
136 | var _l = callbacks.length; | |
137 | for (var i=0;i<_l;i++){ |
|
137 | for (var i=0;i<_l;i++){ | |
138 | var func = callbacks[i]; |
|
138 | var func = callbacks[i]; | |
139 | if(typeof(func)=='function'){ |
|
139 | if(typeof(func)=='function'){ | |
140 | try{ |
|
140 | try{ | |
141 | func(); |
|
141 | func(); | |
142 | }catch (err){}; |
|
142 | }catch (err){}; | |
143 | } |
|
143 | } | |
144 | } |
|
144 | } | |
145 | } |
|
145 | } | |
146 | } |
|
146 | } | |
147 |
|
147 | |||
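`_run_callbacks` deliberately swallows per-callback exceptions so that one failing hook cannot stop the rest of the chain. The same fault-tolerant runner in Python:

```python
def run_callbacks(callbacks):
    """Call each callable in order, ignoring non-callables and
    swallowing per-callback errors, as _run_callbacks() does."""
    if callbacks is None:
        return
    for func in callbacks:
        if callable(func):
            try:
                func()
            except Exception:
                pass  # one failing callback must not stop the others
```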
/**
 * Partial Ajax Implementation
 *
 * @param url: defines url to make partial request
 * @param container: defines id of container to input partial result
 * @param s_call: success callback function that takes o as arg
 *  o.tId
 *  o.status
 *  o.statusText
 *  o.getResponseHeader[ ]
 *  o.getAllResponseHeaders
 *  o.responseText
 *  o.responseXML
 *  o.argument
 * @param f_call: failure callback
 * @param args arguments
 */
function ypjax(url,container,s_call,f_call,args){
    var method='GET';
    if(args===undefined){
        args=null;
    }

    // Set special header for partial ajax == HTTP_X_PARTIAL_XHR
    YUC.initHeader('X-PARTIAL-XHR',true);

    // wrapper of passed callback
    var s_wrapper = (function(o){
        return function(o){
            YUD.get(container).innerHTML=o.responseText;
            YUD.setStyle(container,'opacity','1.0');
            //execute the given original callback
            if (s_call !== undefined){
                s_call(o);
            }
        }
    })()
    YUD.setStyle(container,'opacity','0.3');
    YUC.asyncRequest(method,url,{
        success:s_wrapper,
        failure:function(o){
            console.log(o);
            YUD.get(container).innerHTML='ERROR';
            YUD.setStyle(container,'opacity','1.0');
            YUD.setStyle(container,'color','red');
        }
    },args);

};

/**
 * tooltip activate
 */
var tooltip_activate = function(){
    function toolTipsId(){
        var ids = [];
        var tts = YUQ('.tooltip');
        for (var i = 0; i < tts.length; i++) {
            // if the element doesn't have an id,
            // autogenerate one for the tooltip
            if (!tts[i].id){
                tts[i].id='tt'+((i*100)+tts.length);
            }
            ids.push(tts[i].id);
        }
        return ids
    };
    var myToolTips = new YAHOO.widget.Tooltip("tooltip", {
        context: [[toolTipsId()],"tl","bl",null,[0,5]],
        monitorresize:false,
        xyoffset :[0,0],
        autodismissdelay:300000,
        hidedelay:5,
        showdelay:20
    });
};

/**
 * show more
 */
var show_more_event = function(){
    YUE.on(YUD.getElementsByClassName('show_more'),'click',function(e){
        var el = e.target;
        YUD.setStyle(YUD.get(el.id.substring(1)),'display','');
        YUD.setStyle(el.parentNode,'display','none');
    });
};


/**
 * Quick filter widget
 *
 * @param target: filter input target
 * @param nodes: list of nodes in html we want to filter.
 * @param display_element function that takes current node from nodes and
 *    does hide or show based on the node
 *
 */
var q_filter = function(target,nodes,display_element){

    var nodes = nodes;
    var q_filter_field = YUD.get(target);
    var F = YAHOO.namespace(target);

    YUE.on(q_filter_field,'click',function(){
        q_filter_field.value = '';
    });

    YUE.on(q_filter_field,'keyup',function(e){
        clearTimeout(F.filterTimeout);
        F.filterTimeout = setTimeout(F.updateFilter,600);
    });

    F.filterTimeout = null;

    var show_node = function(node){
        YUD.setStyle(node,'display','')
    }
    var hide_node = function(node){
        YUD.setStyle(node,'display','none');
    }

    F.updateFilter = function() {
        // Reset timeout
        F.filterTimeout = null;

        var obsolete = [];

        var req = q_filter_field.value.toLowerCase();

        var l = nodes.length;
        var i;
        var showing = 0;

        for (i=0;i<l;i++ ){
            var n = nodes[i];
            var target_element = display_element(n)
            if(req && n.innerHTML.toLowerCase().indexOf(req) == -1){
                hide_node(target_element);
            }
            else{
                show_node(target_element);
                showing+=1;
            }
        }

        // if repo_count is set update the number
        var cnt = YUD.get('repo_count');
        if(cnt){
            YUD.get('repo_count').innerHTML = showing;
        }

    }
};

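The visibility rule inside F.updateFilter — keep a node visible when the query is empty or occurs case-insensitively in its text — can be isolated as a pure function. A sketch for illustration; the helper name is not part of the original file:

```javascript
// Sketch of q_filter's matching rule: a node stays visible when the
// query is empty or occurs (case-insensitively) in the node's text.
function nodeMatches(nodeText, query) {
    var req = query.toLowerCase();
    return !req || nodeText.toLowerCase().indexOf(req) !== -1;
}
```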
var ajaxPOST = function(url,postData,success) {
    var sUrl = url;
    var callback = {
        success: success,
        failure: function (o) {
            alert("error");
        }
    };
    var postData = postData;
    var request = YAHOO.util.Connect.asyncRequest('POST', sUrl, callback, postData);
};


/** comments **/
var removeInlineForm = function(form) {
    form.parentNode.removeChild(form);
};

var tableTr = function(cls,body){
    var form = document.createElement('tr');
    YUD.addClass(form, cls);
    form.innerHTML = '<td class="lineno-inline new-inline"></td>'+
                     '<td class="lineno-inline old-inline"></td>'+
                     '<td>{0}</td>'.format(body);
    return form;
};

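tableTr and several functions below call `.format()` on string literals, which is not part of standard JavaScript; RhodeCode defines a String.prototype extension elsewhere in this file. A minimal sketch of such a helper, for reference only — the real implementation may differ:

```javascript
// Minimal sketch of the String.prototype.format helper assumed by
// tableTr() and others: replaces "{0}", "{1}", ... with the
// corresponding positional argument, leaving unmatched indices as-is.
if (!String.prototype.format) {
    String.prototype.format = function() {
        var args = arguments;
        return this.replace(/\{(\d+)\}/g, function(match, idx) {
            return args[idx] !== undefined ? args[idx] : match;
        });
    };
}
```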
var createInlineForm = function(parent_tr, f_path, line) {
    var tmpl = YUD.get('comment-inline-form-template').innerHTML;
    tmpl = tmpl.format(f_path, line);
    var form = tableTr('comment-form-inline',tmpl)

    // create event for hide button
    form = new YAHOO.util.Element(form);
    var form_hide_button = new YAHOO.util.Element(form.getElementsByClassName('hide-inline-form')[0]);
    form_hide_button.on('click', function(e) {
        var newtr = e.currentTarget.parentNode.parentNode.parentNode.parentNode.parentNode;
        removeInlineForm(newtr);
        YUD.removeClass(parent_tr, 'form-open');
    });
    return form
};
var injectInlineForm = function(tr){
    if(YUD.hasClass(tr,'form-open') || YUD.hasClass(tr,'context') || YUD.hasClass(tr,'no-comment')){
        return
    }
    YUD.addClass(tr,'form-open');
    var node = tr.parentNode.parentNode.parentNode.getElementsByClassName('full_f_path')[0];
    var f_path = YUD.getAttribute(node,'path');
    var lineno = getLineNo(tr);
    var form = createInlineForm(tr, f_path, lineno);
    var target_tr = tr;
    if(YUD.hasClass(YUD.getNextSibling(tr),'inline-comments')){
        target_tr = YUD.getNextSibling(tr);
    }
    YUD.insertAfter(form,target_tr);
    YUD.get('text_'+lineno).focus();
    tooltip_activate();
};

var createInlineAddButton = function(tr,label){
    var html = '<div class="add-comment"><span class="ui-btn">{0}</span></div>'.format(label);

    var add = new YAHOO.util.Element(tableTr('inline-comments-button',html));
    add.on('click', function(e) {
        injectInlineForm(tr);
    });
    return add;
};

var getLineNo = function(tr) {
    var line;
    var o = tr.children[0].id.split('_');
    var n = tr.children[1].id.split('_');

    if (n.length >= 2) {
        line = n[n.length-1];
    } else if (o.length >= 2) {
        line = o[o.length-1];
    }

    return line
};


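getLineNo reads the line number out of the ids of the first two cells in a diff row (new-side id first, old-side as fallback); the id parsing itself is pure string work and can be sketched standalone. The helper name is hypothetical:

```javascript
// Sketch of getLineNo's id parsing: take the last '_'-separated token
// of the new-side cell id, falling back to the old-side cell id.
function lineNoFromIds(oldId, newId) {
    var o = oldId.split('_');
    var n = newId.split('_');
    if (n.length >= 2) {
        return n[n.length - 1];
    } else if (o.length >= 2) {
        return o[o.length - 1];
    }
    return undefined;
}
```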
var fileBrowserListeners = function(current_url, node_list_url, url_base,
                                    truncated_lbl, nomatch_lbl){
    var current_url_branch = current_url + "?branch=__BRANCH__";
    var url = url_base;
    var node_url = node_list_url;

    YUE.on('stay_at_branch','click',function(e){
        if(e.target.checked){
            var uri = current_url_branch;
            uri = uri.replace('__BRANCH__',e.target.value);
            window.location = uri;
        }
        else{
            window.location = current_url;
        }
    })

    var n_filter = YUD.get('node_filter');
    var F = YAHOO.namespace('node_filter');

    F.filterTimeout = null;
    var nodes = null;

    F.initFilter = function(){
        YUD.setStyle('node_filter_box_loading','display','');
        YUD.setStyle('search_activate_id','display','none');
        YUD.setStyle('add_node_id','display','none');
        YUC.initHeader('X-PARTIAL-XHR',true);
        YUC.asyncRequest('GET',url,{
            success:function(o){
                nodes = JSON.parse(o.responseText);
                YUD.setStyle('node_filter_box_loading','display','none');
                YUD.setStyle('node_filter_box','display','');
                n_filter.focus();
                if(YUD.hasClass(n_filter,'init')){
                    n_filter.value = '';
                    YUD.removeClass(n_filter,'init');
                }
            },
            failure:function(o){
                console.log('failed to load');
            }
        },null);
    }

    F.updateFilter = function(e) {

        return function(){
            // Reset timeout
            F.filterTimeout = null;
            var query = e.target.value.toLowerCase();
            var match = [];
            var matches = 0;
            var matches_max = 20;
            if (query != ""){
                for(var i=0;i<nodes.length;i++){

                    var pos = nodes[i].name.toLowerCase().indexOf(query)
                    if(query && pos != -1){

                        matches++
                        //show only certain amount to not kill browser
                        if (matches > matches_max){
                            break;
                        }

                        var n = nodes[i].name;
                        var t = nodes[i].type;
                        var n_hl = n.substring(0,pos)
                            +"<b>{0}</b>".format(n.substring(pos,pos+query.length))
                            +n.substring(pos+query.length)
                        match.push('<tr><td><a class="browser-{0}" href="{1}">{2}</a></td><td colspan="5"></td></tr>'.format(t,node_url.replace('__FPATH__',n),n_hl));
                    }
                    if(match.length >= matches_max){
                        match.push('<tr><td>{0}</td><td colspan="5"></td></tr>'.format(truncated_lbl));
                    }

                }
            }
            if(query != ""){
                YUD.setStyle('tbody','display','none');
                YUD.setStyle('tbody_filtered','display','');

                if (match.length==0){
                    match.push('<tr><td>{0}</td><td colspan="5"></td></tr>'.format(nomatch_lbl));
                }

                YUD.get('tbody_filtered').innerHTML = match.join("");
            }
            else{
                YUD.setStyle('tbody','display','');
                YUD.setStyle('tbody_filtered','display','none');
            }

        }
    };

    YUE.on(YUD.get('filter_activate'),'click',function(){
        F.initFilter();
    })
    YUE.on(n_filter,'click',function(){
        if(YUD.hasClass(n_filter,'init')){
            n_filter.value = '';
            YUD.removeClass(n_filter,'init');
        }
    });
    YUE.on(n_filter,'keyup',function(e){
        clearTimeout(F.filterTimeout);
        F.filterTimeout = setTimeout(F.updateFilter(e),600);
    });
};


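The node filter above wraps the first case-insensitive occurrence of the query in bold tags before building the result row; that highlight step is plain string slicing and can be sketched in isolation. The function name is hypothetical:

```javascript
// Sketch of the match-highlighting step in F.updateFilter: wrap the
// first case-insensitive occurrence of query in <b>...</b>.
function highlightMatch(name, query) {
    var pos = name.toLowerCase().indexOf(query.toLowerCase());
    if (pos === -1) {
        return name; // no match: return the name unchanged
    }
    return name.substring(0, pos)
        + '<b>' + name.substring(pos, pos + query.length) + '</b>'
        + name.substring(pos + query.length);
}
```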
var initCodeMirror = function(textAreaId,resetUrl){
    var myCodeMirror = CodeMirror.fromTextArea(YUD.get(textAreaId),{
        mode: "null",
        lineNumbers:true
    });
    YUE.on('reset','click',function(e){
        window.location=resetUrl
    });

    YUE.on('file_enable','click',function(){
        YUD.setStyle('editor_container','display','');
        YUD.setStyle('upload_file_container','display','none');
        YUD.setStyle('filename_container','display','');
    });

    YUE.on('upload_file_enable','click',function(){
        YUD.setStyle('editor_container','display','none');
        YUD.setStyle('upload_file_container','display','');
        YUD.setStyle('filename_container','display','none');
    });
};



var getIdentNode = function(n){
    //walk up through parent nodes until one with an L<number> id is found
    if (typeof n == 'undefined'){
        return -1
    }

    if(typeof n.id != "undefined" && n.id.match('L[0-9]+')){
        return n
    }
    else{
        return getIdentNode(n.parentNode);
    }
};

var getSelectionLink = function(selection_link_label) {
    return function(){
        //get selection from start/to nodes
        if (typeof window.getSelection != "undefined") {
            s = window.getSelection();

            from = getIdentNode(s.anchorNode);
            till = getIdentNode(s.focusNode);

            f_int = parseInt(from.id.replace('L',''));
            t_int = parseInt(till.id.replace('L',''));

            if (f_int > t_int){
                //highlight from bottom
                offset = -35;
                ranges = [t_int,f_int];

            }
            else{
                //highlight from top
                offset = 35;
                ranges = [f_int,t_int];
            }

            if (ranges[0] != ranges[1]){
                if(YUD.get('linktt') == null){
                    hl_div = document.createElement('div');
                    hl_div.id = 'linktt';
                }
                anchor = '#L'+ranges[0]+'-'+ranges[1];
                hl_div.innerHTML = '';
                l = document.createElement('a');
                l.href = location.href.substring(0,location.href.indexOf('#'))+anchor;
                l.innerHTML = selection_link_label;
                hl_div.appendChild(l);

                YUD.get('body').appendChild(hl_div);

                xy = YUD.getXY(till.id);

                YUD.addClass('linktt','yui-tt');
                YUD.setStyle('linktt','top',xy[1]+offset+'px');
                YUD.setStyle('linktt','left',xy[0]+'px');
                YUD.setStyle('linktt','visibility','visible');
            }
            else{
                YUD.setStyle('linktt','visibility','hidden');
            }
        }
    }
};

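getSelectionLink orders the two selected line numbers before building the `#L<from>-<to>` anchor; that normalisation is a small pure step worth sketching on its own. The helper name is hypothetical:

```javascript
// Sketch of the anchor construction in getSelectionLink: order the two
// selected line numbers and build a '#L<low>-<high>' URL fragment.
function selectionAnchor(fromLine, toLine) {
    var lo = Math.min(fromLine, toLine);
    var hi = Math.max(fromLine, toLine);
    return '#L' + lo + '-' + hi;
}
```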
var deleteNotification = function(url, notification_id,callbacks){
    var callback = {
        success:function(o){
            var obj = YUD.get(String("notification_"+notification_id));
            if(obj.parentNode !== undefined){
                obj.parentNode.removeChild(obj);
            }
            _run_callbacks(callbacks);
        },
        failure:function(o){
            alert("error");
        }
    };
    var postData = '_method=delete';
    var sUrl = url.replace('__NOTIFICATION_ID__',notification_id);
    var request = YAHOO.util.Connect.asyncRequest('POST', sUrl,
                                                  callback, postData);
};


/**
 * QUICK REPO MENU
 */
var quick_repo_menu = function(){
    YUE.on(YUQ('.quick_repo_menu'),'mouseenter',function(e){
        var menu = e.currentTarget.firstElementChild.firstElementChild;
        if(YUD.hasClass(menu,'hidden')){
            YUD.replaceClass(e.currentTarget,'hidden','active');
            YUD.replaceClass(menu,'hidden','active');
        }
    })
    YUE.on(YUQ('.quick_repo_menu'),'mouseleave',function(e){
        var menu = e.currentTarget.firstElementChild.firstElementChild;
        if(YUD.hasClass(menu,'active')){
            YUD.replaceClass(e.currentTarget,'active','hidden');
            YUD.replaceClass(menu,'active','hidden');
        }
    })
};


/**
 * TABLE SORTING
 */

// returns a node from given html;
var fromHTML = function(html){
    var _html = document.createElement('element');
    _html.innerHTML = html;
    return _html;
}
var get_rev = function(node){
    var n = node.firstElementChild.firstElementChild;

    if (n===null){
        return -1
    }
    else{
        out = n.firstElementChild.innerHTML.split(':')[0].replace('r','');
        return parseInt(out);
    }
}
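get_rev strips a revision label of the form `r<rev>:<hash>` down to the integer revision; the string handling can be sketched as a standalone function. The name is hypothetical:

```javascript
// Sketch of get_rev's label parsing: 'r42:abcdef' -> 42.
function parseRevLabel(label) {
    var out = label.split(':')[0].replace('r', '');
    return parseInt(out, 10);
}
```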

var get_name = function(node){
    var name = node.firstElementChild.children[2].innerHTML;
    return name
}
var get_group_name = function(node){
    var name = node.firstElementChild.children[1].innerHTML;
    return name
}
var get_date = function(node){
    var date_ = node.firstElementChild.innerHTML;
    return date_
}

var revisionSort = function(a, b, desc, field) {

    var a_ = fromHTML(a.getData(field));
    var b_ = fromHTML(b.getData(field));

    // extract revisions from string nodes
    a_ = get_rev(a_)
    b_ = get_rev(b_)

    var comp = YAHOO.util.Sort.compare;
    var compState = comp(a_, b_, desc);
    return compState;
};
var ageSort = function(a, b, desc, field) {
    var a_ = a.getData(field);
    var b_ = b.getData(field);

    var comp = YAHOO.util.Sort.compare;
    var compState = comp(a_, b_, desc);
    return compState;
};
|
688 | }; | |
685 |
|
689 | |||
686 | var nameSort = function(a, b, desc, field) { |
|
690 | var nameSort = function(a, b, desc, field) { | |
687 | var a_ = fromHTML(a.getData(field)); |
|
691 | var a_ = fromHTML(a.getData(field)); | |
688 | var b_ = fromHTML(b.getData(field)); |
|
692 | var b_ = fromHTML(b.getData(field)); | |
689 |
|
693 | |||
690 | // extract name from table |
|
694 | // extract name from table | |
691 | a_ = get_name(a_) |
|
695 | a_ = get_name(a_) | |
692 | b_ = get_name(b_) |
|
696 | b_ = get_name(b_) | |
693 |
|
697 | |||
694 | var comp = YAHOO.util.Sort.compare; |
|
698 | var comp = YAHOO.util.Sort.compare; | |
695 | var compState = comp(a_, b_, desc); |
|
699 | var compState = comp(a_, b_, desc); | |
696 | return compState; |
|
700 | return compState; | |
697 | }; |
|
701 | }; | |
698 |
|
702 | |||
699 | var groupNameSort = function(a, b, desc, field) { |
|
703 | var groupNameSort = function(a, b, desc, field) { | |
700 | var a_ = fromHTML(a.getData(field)); |
|
704 | var a_ = fromHTML(a.getData(field)); | |
701 | var b_ = fromHTML(b.getData(field)); |
|
705 | var b_ = fromHTML(b.getData(field)); | |
702 |
|
706 | |||
703 | // extract name from table |
|
707 | // extract name from table | |
704 | a_ = get_group_name(a_) |
|
708 | a_ = get_group_name(a_) | |
705 | b_ = get_group_name(b_) |
|
709 | b_ = get_group_name(b_) | |
706 |
|
710 | |||
707 | var comp = YAHOO.util.Sort.compare; |
|
711 | var comp = YAHOO.util.Sort.compare; | |
708 | var compState = comp(a_, b_, desc); |
|
712 | var compState = comp(a_, b_, desc); | |
709 | return compState; |
|
713 | return compState; | |
710 | }; |
|
714 | }; | |
711 | var dateSort = function(a, b, desc, field) { |
|
715 | var dateSort = function(a, b, desc, field) { | |
712 | var a_ = fromHTML(a.getData(field)); |
|
716 | var a_ = fromHTML(a.getData(field)); | |
713 | var b_ = fromHTML(b.getData(field)); |
|
717 | var b_ = fromHTML(b.getData(field)); | |
714 |
|
718 | |||
715 | // extract name from table |
|
719 | // extract name from table | |
716 | a_ = get_date(a_) |
|
720 | a_ = get_date(a_) | |
717 | b_ = get_date(b_) |
|
721 | b_ = get_date(b_) | |
718 |
|
722 | |||
719 | var comp = YAHOO.util.Sort.compare; |
|
723 | var comp = YAHOO.util.Sort.compare; | |
720 | var compState = comp(a_, b_, desc); |
|
724 | var compState = comp(a_, b_, desc); | |
721 | return compState; |
|
725 | return compState; | |
722 | }; No newline at end of file |
|
726 | }; |
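The sorters above all follow one pattern: pull the rendered cell HTML out of the table record, extract a sortable key from it (a revision number, a name, a date), then hand the two keys to a generic comparator. A minimal standalone sketch of that pattern, in plain JavaScript without YUI — `parseRev` and `compareBy` are illustrative names, not part of this patch:

```javascript
// Extract the integer revision from a rendered cell like "<a>r42:abc123</a>".
// Mirrors get_rev above, but works on the raw HTML string instead of DOM nodes.
function parseRev(html) {
    var m = /r(\d+):/.exec(html);
    return m ? parseInt(m[1], 10) : -1;
}

// Build a comparator from a key-extraction function, optionally descending,
// analogous to delegating to YAHOO.util.Sort.compare.
function compareBy(keyFn, desc) {
    return function (a, b) {
        var ka = keyFn(a), kb = keyFn(b);
        var cmp = ka < kb ? -1 : (ka > kb ? 1 : 0);
        return desc ? -cmp : cmp;
    };
}

var rows = ['<a>r10:abc</a>', '<a>r2:def</a>', '<a>r33:0ff</a>'];
rows.sort(compareBy(parseRev, false));
// rows is now ordered r2, r10, r33
```

Keeping key extraction separate from comparison is what lets the original code reuse one compare routine across revision, name, group-name and date columns.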
@@ -1,8 +1,9 @@
+%if h.is_hg(c.scm_type):
 # ${c.scm_type.upper()} changeset patch
 # User ${c.changeset.author|n}
 # Date ${c.changeset.date}
 # Node ID ${c.changeset.raw_id}
 ${c.parent_tmpl}
 ${c.changeset.message}
-
+%endif
 ${c.diffs|n}
@@ -1,136 +1,136 @@
 <%inherit file="/base/base.html"/>
 
 <%def name="title()">
     ${c.repo_name} ${_('File annotate')} - ${c.rhodecode_name}
 </%def>
 
 <%def name="breadcrumbs_links()">
     ${h.link_to(u'Home',h.url('/'))}
     »
     ${h.link_to(c.repo_name,h.url('summary_home',repo_name=c.repo_name))}
     »
     ${_('annotate')} @ R${c.cs.revision}:${h.short_id(c.cs.raw_id)}
 </%def>
 
 <%def name="page_nav()">
     ${self.menu('files')}
 </%def>
 <%def name="main()">
 <div class="box">
     <!-- box / title -->
     <div class="title">
         ${self.breadcrumbs()}
         <ul class="links">
             <li>
                 <span style="text-transform: uppercase;"><a href="#">${_('branch')}: ${c.cs.branch}</a></span>
             </li>
         </ul>
     </div>
     <div class="table">
         <div id="files_data">
             <h3 class="files_location">${_('Location')}: ${h.files_breadcrumbs(c.repo_name,c.cs.revision,c.file.path)}</h3>
             <dl>
                 <dt style="padding-top:10px;font-size:16px">${_('History')}</dt>
                 <dd>
                     <div>
                     ${h.form(h.url('files_diff_home',repo_name=c.repo_name,f_path=c.f_path),method='get')}
-                    ${h.hidden('diff2',c.file.last_changeset.raw_id)}
-                    ${h.select('diff1',c.file.last_changeset.raw_id,c.file_history)}
+                    ${h.hidden('diff2',c.file.changeset.raw_id)}
+                    ${h.select('diff1',c.file.changeset.raw_id,c.file_history)}
                     ${h.submit('diff','diff to revision',class_="ui-btn")}
                     ${h.submit('show_rev','show at revision',class_="ui-btn")}
                     ${h.end_form()}
                     </div>
                 </dd>
             </dl>
             <div id="body" class="codeblock">
                 <div class="code-header">
                     <div class="stats">
                         <div class="left"><img src="${h.url('/images/icons/file.png')}"/></div>
-                        <div class="left item">${h.link_to("r%s:%s" % (c.file.last_changeset.revision,h.short_id(c.file.last_changeset.raw_id)),h.url('changeset_home',repo_name=c.repo_name,revision=c.file.last_changeset.raw_id))}</div>
+                        <div class="left item">${h.link_to("r%s:%s" % (c.file.changeset.revision,h.short_id(c.file.changeset.raw_id)),h.url('changeset_home',repo_name=c.repo_name,revision=c.file.changeset.raw_id))}</div>
                         <div class="left item">${h.format_byte_size(c.file.size,binary=True)}</div>
                         <div class="left item last">${c.file.mimetype}</div>
                         <div class="buttons">
                           ${h.link_to(_('show source'),h.url('files_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.f_path),class_="ui-btn")}
                           ${h.link_to(_('show as raw'),h.url('files_raw_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.f_path),class_="ui-btn")}
                           ${h.link_to(_('download as raw'),h.url('files_rawfile_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.f_path),class_="ui-btn")}
                           % if h.HasRepoPermissionAny('repository.write','repository.admin')(c.repo_name):
                            % if not c.file.is_binary:
                             ${h.link_to(_('edit'),h.url('files_edit_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.f_path),class_="ui-btn")}
                            % endif
                           % endif
                         </div>
                     </div>
                     <div class="author">
                         <div class="gravatar">
                             <img alt="gravatar" src="${h.gravatar_url(h.email(c.cs.author),16)}"/>
                         </div>
                         <div title="${c.cs.author}" class="user">${h.person(c.cs.author)}</div>
                     </div>
                     <div class="commit">${c.file.last_changeset.message}</div>
                 </div>
                 <div class="code-body">
                 %if c.file.is_binary:
                     ${_('Binary file (%s)') % c.file.mimetype}
                 %else:
                     % if c.file.size < c.cut_off_limit:
                         ${h.pygmentize_annotation(c.repo_name,c.file,linenos=True,anchorlinenos=True,lineanchors='L',cssclass="code-highlight")}
                     %else:
                         ${_('File is too big to display')} ${h.link_to(_('show as raw'),
                         h.url('files_raw_home',repo_name=c.repo_name,revision=c.cs.revision,f_path=c.f_path))}
                     %endif
                 <script type="text/javascript">
                 function highlight_lines(lines){
                     for(pos in lines){
                         YUD.setStyle('L'+lines[pos],'background-color','#FFFFBE');
                     }
                 }
                 page_highlights = location.href.substring(location.href.indexOf('#')+1).split('L');
                 if (page_highlights.length == 2){
                     highlight_ranges = page_highlights[1].split(",");
 
                     var h_lines = [];
                     for (pos in highlight_ranges){
                         var _range = highlight_ranges[pos].split('-');
                         if(_range.length == 2){
                             var start = parseInt(_range[0]);
                             var end = parseInt(_range[1]);
                             if (start < end){
                                 for(var i=start;i<=end;i++){
                                     h_lines.push(i);
                                 }
                             }
                         }
                         else{
                             h_lines.push(parseInt(highlight_ranges[pos]));
                         }
                     }
                     highlight_lines(h_lines);
 
                     //remember original location
                     var old_hash = location.href.substring(location.href.indexOf('#'));
 
                     // this makes a jump to anchor moved by 3 posstions for padding
                     window.location.hash = '#L'+Math.max(parseInt(h_lines[0])-3,1);
 
                     //sets old anchor
                     window.location.hash = old_hash;
 
                 }
                 </script>
                 %endif
                 </div>
             </div>
             <script type="text/javascript">
             YAHOO.util.Event.onDOMReady(function(){
                 YUE.on('show_rev','click',function(e){
                     YAHOO.util.Event.preventDefault(e);
                     var cs = YAHOO.util.Dom.get('diff1').value;
                     var url = "${h.url('files_annotate_home',repo_name=c.repo_name,revision='__CS__',f_path=c.f_path)}".replace('__CS__',cs);
                     window.location = url;
                 });
             });
             </script>
         </div>
     </div>
 </div>
 </%def>
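The inline script in the annotate template above turns a URL fragment such as `#L10-12,15` into the list of line numbers to highlight. The range-expansion step can be sketched as a standalone function — `expandLineRanges` is an illustrative name, and unlike the template's version it also expands a degenerate range like `7-7` to its single line:

```javascript
// Expand "#L<ranges>" fragment syntax: the part after "#L", e.g. "10-12,15",
// becomes the individual line numbers [10, 11, 12, 15].
function expandLineRanges(hash) {
    var lines = [];
    hash.split(',').forEach(function (part) {
        var range = part.split('-');
        if (range.length === 2) {
            var start = parseInt(range[0], 10), end = parseInt(range[1], 10);
            for (var i = start; i <= end; i++) {
                lines.push(i);
            }
        } else {
            lines.push(parseInt(part, 10));
        }
    });
    return lines;
}
```

Each resulting number maps to a Pygments line anchor (`lineanchors='L'` produces elements with ids `L1`, `L2`, …), which is why the template styles `'L' + lines[pos]`.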
@@ -1,112 +1,112 @@
 <%def name="file_class(node)">
     %if node.is_file():
         <%return "browser-file" %>
     %else:
         <%return "browser-dir"%>
     %endif
 </%def>
 <div id="body" class="browserblock">
     <div class="browser-header">
         <div class="browser-nav">
             ${h.form(h.url.current())}
             <div class="info_box">
                 <span class="rev">${_('view')}@rev</span>
                 <a class="ui-btn" href="${c.url_prev}" title="${_('previous revision')}">«</a>
                 ${h.text('at_rev',value=c.changeset.revision,size=5)}
                 <a class="ui-btn" href="${c.url_next}" title="${_('next revision')}">»</a>
                 ## ${h.submit('view',_('view'),class_="ui-btn")}
             </div>
             ${h.end_form()}
         </div>
         <div class="browser-branch">
             ${h.checkbox('stay_at_branch',c.changeset.branch,c.changeset.branch==c.branch)}
             <label>${_('follow current branch')}</label>
         </div>
         <div class="browser-search">
             <div id="search_activate_id" class="search_activate">
                 <a class="ui-btn" id="filter_activate" href="#">${_('search file list')}</a>
             </div>
             % if h.HasRepoPermissionAny('repository.write','repository.admin')(c.repo_name):
             <div id="add_node_id" class="add_node">
                 <a class="ui-btn" href="${h.url('files_add_home',repo_name=c.repo_name,revision=c.changeset.raw_id,f_path=c.f_path)}">${_('add new file')}</a>
             </div>
             % endif
             <div>
                 <div id="node_filter_box_loading" style="display:none">${_('Loading file list...')}</div>
                 <div id="node_filter_box" style="display:none">
                     ${h.files_breadcrumbs(c.repo_name,c.changeset.raw_id,c.file.path)}/<input class="init" type="text" value="type to search..." name="filter" size="25" id="node_filter" autocomplete="off">
                 </div>
             </div>
         </div>
     </div>
 
     <div class="browser-body">
         <table class="code-browser">
             <thead>
                 <tr>
                     <th>${_('Name')}</th>
                     <th>${_('Size')}</th>
                     <th>${_('Mimetype')}</th>
-                    <th>${_('Revision')}</th>
+                    <th>${_('Last Revision')}</th>
                     <th>${_('Last modified')}</th>
                     <th>${_('Last commiter')}</th>
                 </tr>
             </thead>
 
             <tbody id="tbody">
             %if c.file.parent:
                 <tr class="parity0">
                     <td>
                         ${h.link_to('..',h.url('files_home',repo_name=c.repo_name,revision=c.changeset.raw_id,f_path=c.file.parent.path),class_="browser-dir ypjax-link")}
                     </td>
                     <td></td>
                     <td></td>
                     <td></td>
                     <td></td>
                     <td></td>
                 </tr>
             %endif
 
             %for cnt,node in enumerate(c.file):
                 <tr class="parity${cnt%2}">
                     <td>
-                        ${h.link_to(node.name,h.url('files_home',repo_name=c.repo_name,revision=c.changeset.raw_id,f_path=node.path),class_=file_class(node)+" ypjax-link")}
+                        ${h.link_to(node.name,h.url('files_home',repo_name=c.repo_name,revision=c.changeset.raw_id,f_path=h.safe_unicode(node.path)),class_=file_class(node)+" ypjax-link")}
                     </td>
                     <td>
                     %if node.is_file():
                         ${h.format_byte_size(node.size,binary=True)}
                     %endif
                     </td>
                     <td>
                     %if node.is_file():
                         ${node.mimetype}
                     %endif
                     </td>
                     <td>
                     %if node.is_file():
                         <div class="tooltip" title="${node.last_changeset.message}">
                         <pre>${'r%s:%s' % (node.last_changeset.revision,node.last_changeset.short_id)}</pre>
                         </div>
                     %endif
                     </td>
                     <td>
                     %if node.is_file():
                         <span class="tooltip" title="${node.last_changeset.date}">
                         ${h.age(node.last_changeset.date)}</span>
                     %endif
                     </td>
                     <td>
                     %if node.is_file():
                         <span title="${node.last_changeset.author}">
                         ${h.person(node.last_changeset.author)}
                         </span>
                     %endif
                     </td>
                 </tr>
             %endfor
             </tbody>
             <tbody id="tbody_filtered" style="display:none">
             </tbody>
         </table>
     </div>
 </div>
@@ -1,78 +1,78 @@
 <%inherit file="/base/base.html"/>
 
 <%def name="title()">
     ${c.repo_name} ${_('Edit file')} - ${c.rhodecode_name}
 </%def>
 
 <%def name="js_extra()">
 <script type="text/javascript" src="${h.url('/js/codemirror.js')}"></script>
 </%def>
 <%def name="css_extra()">
 <link rel="stylesheet" type="text/css" href="${h.url('/css/codemirror.css')}"/>
 </%def>
 
 <%def name="breadcrumbs_links()">
     ${h.link_to(u'Home',h.url('/'))}
     »
     ${h.link_to(c.repo_name,h.url('summary_home',repo_name=c.repo_name))}
     »
     ${_('edit file')} @ R${c.cs.revision}:${h.short_id(c.cs.raw_id)}
 </%def>
 
 <%def name="page_nav()">
     ${self.menu('files')}
 </%def>
 <%def name="main()">
 <div class="box">
     <!-- box / title -->
     <div class="title">
         ${self.breadcrumbs()}
         <ul class="links">
             <li>
                 <span style="text-transform: uppercase;">
                 <a href="#">${_('branch')}: ${c.cs.branch}</a></span>
             </li>
         </ul>
     </div>
     <div class="table">
         <div id="files_data">
             <h3 class="files_location">${_('Location')}: ${h.files_breadcrumbs(c.repo_name,c.cs.revision,c.file.path)}</h3>
             ${h.form(h.url.current(),method='post',id='eform')}
             <div id="body" class="codeblock">
                 <div class="code-header">
                     <div class="stats">
                         <div class="left"><img src="${h.url('/images/icons/file.png')}"/></div>
-                        <div class="left item">${h.link_to("r%s:%s" % (c.file.last_changeset.revision,h.short_id(c.file.last_changeset.raw_id)),h.url('changeset_home',repo_name=c.repo_name,revision=c.file.last_changeset.raw_id))}</div>
+                        <div class="left item">${h.link_to("r%s:%s" % (c.file.changeset.revision,h.short_id(c.file.changeset.raw_id)),h.url('changeset_home',repo_name=c.repo_name,revision=c.file.changeset.raw_id))}</div>
                         <div class="left item">${h.format_byte_size(c.file.size,binary=True)}</div>
                         <div class="left item last">${c.file.mimetype}</div>
                         <div class="buttons">
                           ${h.link_to(_('show annotation'),h.url('files_annotate_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.f_path),class_="ui-btn")}
                           ${h.link_to(_('show as raw'),h.url('files_raw_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.f_path),class_="ui-btn")}
                           ${h.link_to(_('download as raw'),h.url('files_rawfile_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.f_path),class_="ui-btn")}
                           % if h.HasRepoPermissionAny('repository.write','repository.admin')(c.repo_name):
                            % if not c.file.is_binary:
                             ${h.link_to(_('source'),h.url('files_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.f_path),class_="ui-btn")}
                            % endif
                           % endif
                         </div>
                     </div>
                     <div class="commit">${_('Editing file')}: ${c.file.path}</div>
                 </div>
                 <pre id="editor_pre"></pre>
                 <textarea id="editor" name="content" style="display:none">${h.escape(c.file.content)|n}</textarea>
                 <div style="padding: 10px;color:#666666">${_('commit message')}</div>
                 <textarea id="commit" name="message" style="height: 60px;width: 99%;margin-left:4px"></textarea>
             </div>
             <div style="text-align: left;padding-top: 5px">
             ${h.submit('commit',_('Commit changes'),class_="ui-btn")}
             ${h.reset('reset',_('Reset'),class_="ui-btn")}
             </div>
             ${h.end_form()}
             <script type="text/javascript">
                 var reset_url = "${h.url('files_home',repo_name=c.repo_name,revision=c.cs.raw_id,f_path=c.file.path)}";
                 initCodeMirror('editor',reset_url);
             </script>
         </div>
     </div>
 </div>
 </%def>
@@ -1,104 +1,104 b''
1 | <dl>
2 | <dt style="padding-top:10px;font-size:16px">${_('History')}</dt>
3 | <dd>
4 | <div>
5 | ${h.form(h.url('files_diff_home',repo_name=c.repo_name,f_path=c.f_path),method='get')}
6 | ${h.hidden('diff2',c.file.changeset.raw_id)}
7 | ${h.select('diff1',c.file.changeset.raw_id,c.file_history)}
8 | ${h.submit('diff','diff to revision',class_="ui-btn")}
9 | ${h.submit('show_rev','show at revision',class_="ui-btn")}
10 | ${h.end_form()}
11 | </div>
12 | </dd>
13 | </dl>
14 |
15 | <div id="body" class="codeblock">
16 | <div class="code-header">
17 | <div class="stats">
18 | <div class="left img"><img src="${h.url('/images/icons/file.png')}"/></div>
19 | <div class="left item"><pre>${h.link_to("r%s:%s" % (c.file.changeset.revision,h.short_id(c.file.changeset.raw_id)),h.url('changeset_home',repo_name=c.repo_name,revision=c.file.changeset.raw_id))}</pre></div>
20 | <div class="left item"><pre>${h.format_byte_size(c.file.size,binary=True)}</pre></div>
21 | <div class="left item last"><pre>${c.file.mimetype}</pre></div>
22 | <div class="buttons">
23 | ${h.link_to(_('show annotation'),h.url('files_annotate_home',repo_name=c.repo_name,revision=c.file.changeset.raw_id,f_path=c.f_path),class_="ui-btn")}
24 | ${h.link_to(_('show as raw'),h.url('files_raw_home',repo_name=c.repo_name,revision=c.file.changeset.raw_id,f_path=c.f_path),class_="ui-btn")}
25 | ${h.link_to(_('download as raw'),h.url('files_rawfile_home',repo_name=c.repo_name,revision=c.file.changeset.raw_id,f_path=c.f_path),class_="ui-btn")}
26 | % if h.HasRepoPermissionAny('repository.write','repository.admin')(c.repo_name):
27 | % if not c.file.is_binary:
28 | ${h.link_to(_('edit'),h.url('files_edit_home',repo_name=c.repo_name,revision=c.file.changeset.raw_id,f_path=c.f_path),class_="ui-btn")}
29 | % endif
30 | % endif
31 | </div>
32 | </div>
33 | <div class="author">
34 | <div class="gravatar">
35 | <img alt="gravatar" src="${h.gravatar_url(h.email(c.file.changeset.author),16)}"/>
36 | </div>
37 | <div title="${c.file.changeset.author}" class="user">${h.person(c.file.changeset.author)}</div>
38 | </div>
39 | <div class="commit">${h.urlify_commit(c.file.changeset.message,c.repo_name)}</div>
40 | </div>
41 | <div class="code-body">
42 | %if c.file.is_binary:
43 | ${_('Binary file (%s)') % c.file.mimetype}
44 | %else:
45 | % if c.file.size < c.cut_off_limit:
46 | ${h.pygmentize(c.file,linenos=True,anchorlinenos=True,lineanchors='L',cssclass="code-highlight")}
47 | %else:
48 | ${_('File is too big to display')} ${h.link_to(_('show as raw'),
49 | h.url('files_raw_home',repo_name=c.repo_name,revision=c.file.changeset.raw_id,f_path=c.f_path))}
50 | %endif
51 | <script type="text/javascript">
52 | function highlight_lines(lines){
53 | for(pos in lines){
54 | YUD.setStyle('L'+lines[pos],'background-color','#FFFFBE');
55 | }
56 | }
57 | page_highlights = location.href.substring(location.href.indexOf('#')+1).split('L');
58 | if (page_highlights.length == 2){
59 | highlight_ranges = page_highlights[1].split(",");
60 |
61 | var h_lines = [];
62 | for (pos in highlight_ranges){
63 | var _range = highlight_ranges[pos].split('-');
64 | if(_range.length == 2){
65 | var start = parseInt(_range[0]);
66 | var end = parseInt(_range[1]);
67 | if (start < end){
68 | for(var i=start;i<=end;i++){
69 | h_lines.push(i);
70 | }
71 | }
72 | }
73 | else{
74 | h_lines.push(parseInt(highlight_ranges[pos]));
75 | }
76 | }
77 | highlight_lines(h_lines);
78 |
79 | //remember original location
80 | var old_hash = location.href.substring(location.href.indexOf('#'));
81 |
82 | // this makes a jump to anchor moved by 3 positions for padding
83 | window.location.hash = '#L'+Math.max(parseInt(h_lines[0])-3,1);
84 |
85 | //sets old anchor
86 | window.location.hash = old_hash;
87 |
88 | }
89 | </script>
90 | %endif
91 | </div>
92 | </div>
93 |
94 | <script type="text/javascript">
95 | YUE.onDOMReady(function(){
96 | YUE.on('show_rev','click',function(e){
97 | YUE.preventDefault(e);
98 | var cs = YUD.get('diff1').value;
99 | var url = "${h.url('files_home',repo_name=c.repo_name,revision='__CS__',f_path=c.f_path)}".replace('__CS__',cs);
100 | window.location = url;
101 | });
102 | YUE.on('hlcode','mouseup',getSelectionLink("${_('Selection link')}"))
103 | });
104 | </script>
@@ -1,318 +1,314 b''
1 | from rhodecode.tests import *
2 |
3 | ARCHIVE_SPECS = {
4 | '.tar.bz2': ('application/x-bzip2', 'tbz2', ''),
5 | '.tar.gz': ('application/x-gzip', 'tgz', ''),
6 | '.zip': ('application/zip', 'zip', ''),
7 | }
8 |
9 |
10 | class TestFilesController(TestController):
11 |
12 | def test_index(self):
13 | self.log_user()
14 | response = self.app.get(url(controller='files', action='index',
15 | repo_name=HG_REPO,
16 | revision='tip',
17 | f_path='/'))
18 | # Test response...
19 | response.mustcontain('<a class="browser-dir ypjax-link" href="/vcs_test_hg/files/27cd5cce30c96924232dffcd24178a07ffeb5dfc/docs">docs</a>')
20 | response.mustcontain('<a class="browser-dir ypjax-link" href="/vcs_test_hg/files/27cd5cce30c96924232dffcd24178a07ffeb5dfc/tests">tests</a>')
21 | response.mustcontain('<a class="browser-dir ypjax-link" href="/vcs_test_hg/files/27cd5cce30c96924232dffcd24178a07ffeb5dfc/vcs">vcs</a>')
22 | response.mustcontain('<a class="browser-file ypjax-link" href="/vcs_test_hg/files/27cd5cce30c96924232dffcd24178a07ffeb5dfc/.hgignore">.hgignore</a>')
23 | response.mustcontain('<a class="browser-file ypjax-link" href="/vcs_test_hg/files/27cd5cce30c96924232dffcd24178a07ffeb5dfc/MANIFEST.in">MANIFEST.in</a>')
24 |
25 | def test_index_revision(self):
26 | self.log_user()
27 |
28 | response = self.app.get(
29 | url(controller='files', action='index',
30 | repo_name=HG_REPO,
31 | revision='7ba66bec8d6dbba14a2155be32408c435c5f4492',
32 | f_path='/')
33 | )
34 |
35 | #Test response...
36 |
37 | response.mustcontain('<a class="browser-dir ypjax-link" href="/vcs_test_hg/files/7ba66bec8d6dbba14a2155be32408c435c5f4492/docs">docs</a>')
38 | response.mustcontain('<a class="browser-dir ypjax-link" href="/vcs_test_hg/files/7ba66bec8d6dbba14a2155be32408c435c5f4492/tests">tests</a>')
39 | response.mustcontain('<a class="browser-file ypjax-link" href="/vcs_test_hg/files/7ba66bec8d6dbba14a2155be32408c435c5f4492/README.rst">README.rst</a>')
40 | response.mustcontain('1.1 KiB')
41 | response.mustcontain('text/x-python')
42 |
43 | def test_index_different_branch(self):
44 | self.log_user()
45 |
46 | response = self.app.get(url(controller='files', action='index',
47 | repo_name=HG_REPO,
48 | revision='97e8b885c04894463c51898e14387d80c30ed1ee',
49 | f_path='/'))
50 |
51 | response.mustcontain("""<span style="text-transform: uppercase;"><a href="#">branch: git</a></span>""")
52 |
53 | def test_index_paging(self):
54 | self.log_user()
55 |
56 | for r in [(73, 'a066b25d5df7016b45a41b7e2a78c33b57adc235'),
57 | (92, 'cc66b61b8455b264a7a8a2d8ddc80fcfc58c221e'),
58 | (109, '75feb4c33e81186c87eac740cee2447330288412'),
59 | (1, '3d8f361e72ab303da48d799ff1ac40d5ac37c67e'),
60 | (0, 'b986218ba1c9b0d6a259fac9b050b1724ed8e545')]:
61 |
62 | response = self.app.get(url(controller='files', action='index',
63 | repo_name=HG_REPO,
64 | revision=r[1],
65 | f_path='/'))
66 |
67 | response.mustcontain("""@ r%s:%s""" % (r[0], r[1][:12]))
68 |
69 | def test_file_source(self):
70 | self.log_user()
71 | response = self.app.get(url(controller='files', action='index',
72 | repo_name=HG_REPO,
73 | revision='27cd5cce30c96924232dffcd24178a07ffeb5dfc',
74 | f_path='vcs/nodes.py'))
75 |
76 | #test or history
77 | response.mustcontain("""<optgroup label="Changesets">
78 | <option value="8911406ad776fdd3d0b9932a2e89677e57405a48">r167:8911406ad776 (default)</option>
79 | <option value="aa957ed78c35a1541f508d2ec90e501b0a9e3167">r165:aa957ed78c35 (default)</option>
80 | <option value="48e11b73e94c0db33e736eaeea692f990cb0b5f1">r140:48e11b73e94c (default)</option>
81 | <option value="adf3cbf483298563b968a6c673cd5bde5f7d5eea">r126:adf3cbf48329 (default)</option>
82 | <option value="6249fd0fb2cfb1411e764129f598e2cf0de79a6f">r113:6249fd0fb2cf (git)</option>
83 | <option value="75feb4c33e81186c87eac740cee2447330288412">r109:75feb4c33e81 (default)</option>
84 | <option value="9a4dc232ecdc763ef2e98ae2238cfcbba4f6ad8d">r108:9a4dc232ecdc (default)</option>
85 | <option value="595cce4efa21fda2f2e4eeb4fe5f2a6befe6fa2d">r107:595cce4efa21 (default)</option>
86 | <option value="4a8bd421fbc2dfbfb70d85a3fe064075ab2c49da">r104:4a8bd421fbc2 (default)</option>
87 | <option value="57be63fc8f85e65a0106a53187f7316f8c487ffa">r102:57be63fc8f85 (default)</option>
88 | <option value="5530bd87f7e2e124a64d07cb2654c997682128be">r101:5530bd87f7e2 (git)</option>
89 | <option value="e516008b1c93f142263dc4b7961787cbad654ce1">r99:e516008b1c93 (default)</option>
90 | <option value="41f43fc74b8b285984554532eb105ac3be5c434f">r93:41f43fc74b8b (default)</option>
91 | <option value="cc66b61b8455b264a7a8a2d8ddc80fcfc58c221e">r92:cc66b61b8455 (default)</option>
92 | <option value="73ab5b616b3271b0518682fb4988ce421de8099f">r91:73ab5b616b32 (default)</option>
93 | <option value="e0da75f308c0f18f98e9ce6257626009fdda2b39">r82:e0da75f308c0 (default)</option>
94 | <option value="fb2e41e0f0810be4d7103bc2a4c7be16ee3ec611">r81:fb2e41e0f081 (default)</option>
95 | <option value="602ae2f5e7ade70b3b66a58cdd9e3e613dc8a028">r76:602ae2f5e7ad (default)</option>
96 | <option value="a066b25d5df7016b45a41b7e2a78c33b57adc235">r73:a066b25d5df7 (default)</option>
97 | <option value="637a933c905958ce5151f154147c25c1c7b68832">r61:637a933c9059 (web)</option>
98 | <option value="0c21004effeb8ce2d2d5b4a8baf6afa8394b6fbc">r60:0c21004effeb (web)</option>
99 | <option value="a1f39c56d3f1d52d5fb5920370a2a2716cd9a444">r59:a1f39c56d3f1 (web)</option>
100 | <option value="97d32df05c715a3bbf936bf3cc4e32fb77fe1a7f">r58:97d32df05c71 (web)</option>
101 | <option value="08eaf14517718dccea4b67755a93368341aca919">r57:08eaf1451771 (web)</option>
102 | <option value="22f71ad265265a53238359c883aa976e725aa07d">r56:22f71ad26526 (web)</option>
103 | <option value="97501f02b7b4330924b647755663a2d90a5e638d">r49:97501f02b7b4 (web)</option>
104 | <option value="86ede6754f2b27309452bb11f997386ae01d0e5a">r47:86ede6754f2b (web)</option>
105 | <option value="014c40c0203c423dc19ecf94644f7cac9d4cdce0">r45:014c40c0203c (web)</option>
106 | <option value="ee87846a61c12153b51543bf860e1026c6d3dcba">r30:ee87846a61c1 (default)</option>
107 | <option value="9bb326a04ae5d98d437dece54be04f830cf1edd9">r26:9bb326a04ae5 (default)</option>
108 | <option value="536c1a19428381cfea92ac44985304f6a8049569">r24:536c1a194283 (default)</option>
109 | <option value="dc5d2c0661b61928834a785d3e64a3f80d3aad9c">r8:dc5d2c0661b6 (default)</option>
110 | <option value="3803844fdbd3b711175fc3da9bdacfcd6d29a6fb">r7:3803844fdbd3 (default)</option>
111 | </optgroup>
112 | <optgroup label="Branches">
113 | <option selected="selected" value="27cd5cce30c96924232dffcd24178a07ffeb5dfc">default</option>
114 | <option value="97e8b885c04894463c51898e14387d80c30ed1ee">git</option>
115 | <option value="2e6a2bf9356ca56df08807f4ad86d480da72a8f4">web</option>
116 | </optgroup>
117 | <optgroup label="Tags">
118 | <option selected="selected" value="27cd5cce30c96924232dffcd24178a07ffeb5dfc">tip</option>
119 | <option value="fd4bdb5e9b2a29b4393a4ac6caef48c17ee1a200">0.1.4</option>
120 | <option value="17544fbfcd33ffb439e2b728b5d526b1ef30bfcf">0.1.3</option>
121 | <option value="a7e60bff65d57ac3a1a1ce3b12a70f8a9e8a7720">0.1.2</option>
122 | <option value="eb3a60fc964309c1a318b8dfe26aa2d1586c85ae">0.1.1</option>
123 | </optgroup>
124 | """)
125 |
126 | response.mustcontain("""<div class="commit">merge</div>""")
127 |
128 | response.mustcontain("""<span style="text-transform: uppercase;"><a href="#">branch: default</a></span>""")
129 |
130 | def test_file_annotation(self):
131 | self.log_user()
132 | response = self.app.get(url(controller='files', action='annotate',
133 | repo_name=HG_REPO,
134 | revision='27cd5cce30c96924232dffcd24178a07ffeb5dfc',
135 | f_path='vcs/nodes.py'))
136 |
137 |
138 | response.mustcontain("""<optgroup label="Changesets">
139 | <option value="8911406ad776fdd3d0b9932a2e89677e57405a48">r167:8911406ad776 (default)</option>
140 | <option value="aa957ed78c35a1541f508d2ec90e501b0a9e3167">r165:aa957ed78c35 (default)</option>
141 | <option value="48e11b73e94c0db33e736eaeea692f990cb0b5f1">r140:48e11b73e94c (default)</option>
142 | <option value="adf3cbf483298563b968a6c673cd5bde5f7d5eea">r126:adf3cbf48329 (default)</option>
143 | <option value="6249fd0fb2cfb1411e764129f598e2cf0de79a6f">r113:6249fd0fb2cf (git)</option>
144 | <option value="75feb4c33e81186c87eac740cee2447330288412">r109:75feb4c33e81 (default)</option>
145 | <option value="9a4dc232ecdc763ef2e98ae2238cfcbba4f6ad8d">r108:9a4dc232ecdc (default)</option>
146 | <option value="595cce4efa21fda2f2e4eeb4fe5f2a6befe6fa2d">r107:595cce4efa21 (default)</option>
147 | <option value="4a8bd421fbc2dfbfb70d85a3fe064075ab2c49da">r104:4a8bd421fbc2 (default)</option>
148 | <option value="57be63fc8f85e65a0106a53187f7316f8c487ffa">r102:57be63fc8f85 (default)</option>
149 | <option value="5530bd87f7e2e124a64d07cb2654c997682128be">r101:5530bd87f7e2 (git)</option>
150 | <option value="e516008b1c93f142263dc4b7961787cbad654ce1">r99:e516008b1c93 (default)</option>
151 | <option value="41f43fc74b8b285984554532eb105ac3be5c434f">r93:41f43fc74b8b (default)</option>
152 | <option value="cc66b61b8455b264a7a8a2d8ddc80fcfc58c221e">r92:cc66b61b8455 (default)</option>
153 | <option value="73ab5b616b3271b0518682fb4988ce421de8099f">r91:73ab5b616b32 (default)</option>
154 | <option value="e0da75f308c0f18f98e9ce6257626009fdda2b39">r82:e0da75f308c0 (default)</option>
155 | <option value="fb2e41e0f0810be4d7103bc2a4c7be16ee3ec611">r81:fb2e41e0f081 (default)</option>
156 | <option value="602ae2f5e7ade70b3b66a58cdd9e3e613dc8a028">r76:602ae2f5e7ad (default)</option>
157 | <option value="a066b25d5df7016b45a41b7e2a78c33b57adc235">r73:a066b25d5df7 (default)</option>
158 | <option value="637a933c905958ce5151f154147c25c1c7b68832">r61:637a933c9059 (web)</option>
159 | <option value="0c21004effeb8ce2d2d5b4a8baf6afa8394b6fbc">r60:0c21004effeb (web)</option>
160 | <option value="a1f39c56d3f1d52d5fb5920370a2a2716cd9a444">r59:a1f39c56d3f1 (web)</option>
161 | <option value="97d32df05c715a3bbf936bf3cc4e32fb77fe1a7f">r58:97d32df05c71 (web)</option>
162 | <option value="08eaf14517718dccea4b67755a93368341aca919">r57:08eaf1451771 (web)</option>
163 | <option value="22f71ad265265a53238359c883aa976e725aa07d">r56:22f71ad26526 (web)</option>
164 | <option value="97501f02b7b4330924b647755663a2d90a5e638d">r49:97501f02b7b4 (web)</option>
165 | <option value="86ede6754f2b27309452bb11f997386ae01d0e5a">r47:86ede6754f2b (web)</option>
166 | <option value="014c40c0203c423dc19ecf94644f7cac9d4cdce0">r45:014c40c0203c (web)</option>
167 | <option value="ee87846a61c12153b51543bf860e1026c6d3dcba">r30:ee87846a61c1 (default)</option>
168 | <option value="9bb326a04ae5d98d437dece54be04f830cf1edd9">r26:9bb326a04ae5 (default)</option>
169 | <option value="536c1a19428381cfea92ac44985304f6a8049569">r24:536c1a194283 (default)</option>
170 | <option value="dc5d2c0661b61928834a785d3e64a3f80d3aad9c">r8:dc5d2c0661b6 (default)</option>
171 | <option value="3803844fdbd3b711175fc3da9bdacfcd6d29a6fb">r7:3803844fdbd3 (default)</option>
172 | </optgroup>
173 | <optgroup label="Branches">
174 | <option selected="selected" value="27cd5cce30c96924232dffcd24178a07ffeb5dfc">default</option>
175 | <option value="97e8b885c04894463c51898e14387d80c30ed1ee">git</option>
176 | <option value="2e6a2bf9356ca56df08807f4ad86d480da72a8f4">web</option>
177 | </optgroup>
178 | <optgroup label="Tags">
179 | <option selected="selected" value="27cd5cce30c96924232dffcd24178a07ffeb5dfc">tip</option>
180 | <option value="fd4bdb5e9b2a29b4393a4ac6caef48c17ee1a200">0.1.4</option>
181 | <option value="17544fbfcd33ffb439e2b728b5d526b1ef30bfcf">0.1.3</option>
182 | <option value="a7e60bff65d57ac3a1a1ce3b12a70f8a9e8a7720">0.1.2</option>
183 | <option value="eb3a60fc964309c1a318b8dfe26aa2d1586c85ae">0.1.1</option>
184 | </optgroup>""")
185 |
186 | response.mustcontain("""<span style="text-transform: uppercase;"><a href="#">branch: default</a></span>""")
187 |
188 | def test_archival(self):
189 | self.log_user()
190 |
191 | for arch_ext, info in ARCHIVE_SPECS.items():
192 | fname = '27cd5cce30c96924232dffcd24178a07ffeb5dfc%s' % arch_ext
193 | filename = '%s-%s' % (HG_REPO, fname)
194 |
199 | response = self.app.get(url(controller='files', action='archivefile', |
|
195 | response = self.app.get(url(controller='files', action='archivefile', | |
200 | repo_name=HG_REPO, |
|
196 | repo_name=HG_REPO, | |
201 | fname=fname)) |
|
197 | fname=fname)) | |
202 |
|
198 | |||
203 | assert response.status == '200 OK', 'wrong response code' |
|
199 | assert response.status == '200 OK', 'wrong response code' | |
204 | assert response.response._headers.items() == [('Pragma', 'no-cache'), |
|
200 | assert response.response._headers.items() == [('Pragma', 'no-cache'), | |
205 | ('Cache-Control', 'no-cache'), |
|
201 | ('Cache-Control', 'no-cache'), | |
206 | ('Content-Type', '%s; charset=utf-8' % info[0]), |
|
202 | ('Content-Type', '%s; charset=utf-8' % info[0]), | |
207 | ('Content-Disposition', 'attachment; filename=%s' % filename), ], 'wrong headers' |
|
203 | ('Content-Disposition', 'attachment; filename=%s' % filename), ], 'wrong headers' | |
208 |
|
204 | |||
209 | def test_archival_wrong_ext(self): |
|
205 | def test_archival_wrong_ext(self): | |
210 | self.log_user() |
|
206 | self.log_user() | |
211 |
|
207 | |||
212 | for arch_ext in ['tar', 'rar', 'x', '..ax', '.zipz']: |
|
208 | for arch_ext in ['tar', 'rar', 'x', '..ax', '.zipz']: | |
213 | fname = '27cd5cce30c96924232dffcd24178a07ffeb5dfc%s' % arch_ext |
|
209 | fname = '27cd5cce30c96924232dffcd24178a07ffeb5dfc%s' % arch_ext | |
214 |
|
210 | |||
215 | response = self.app.get(url(controller='files', action='archivefile', |
|
211 | response = self.app.get(url(controller='files', action='archivefile', | |
216 | repo_name=HG_REPO, |
|
212 | repo_name=HG_REPO, | |
217 | fname=fname)) |
|
213 | fname=fname)) | |
218 | assert 'Unknown archive type' in response.body |
|
214 | assert 'Unknown archive type' in response.body | |
219 |
|
215 | |||
220 |
|
216 | |||
221 | def test_archival_wrong_revision(self): |
|
217 | def test_archival_wrong_revision(self): | |
222 | self.log_user() |
|
218 | self.log_user() | |
223 |
|
219 | |||
224 | for rev in ['00x000000', 'tar', 'wrong', '@##$@$424213232', '232dffcd']: |
|
220 | for rev in ['00x000000', 'tar', 'wrong', '@##$@$424213232', '232dffcd']: | |
225 | fname = '%s.zip' % rev |
|
221 | fname = '%s.zip' % rev | |
226 |
|
222 | |||
227 | response = self.app.get(url(controller='files', action='archivefile', |
|
223 | response = self.app.get(url(controller='files', action='archivefile', | |
228 | repo_name=HG_REPO, |
|
224 | repo_name=HG_REPO, | |
229 | fname=fname)) |
|
225 | fname=fname)) | |
230 | assert 'Unknown revision' in response.body |
|
226 | assert 'Unknown revision' in response.body | |
231 |
|
227 | |||
232 | #========================================================================== |
|
228 | #========================================================================== | |
233 | # RAW FILE |
|
229 | # RAW FILE | |
234 | #========================================================================== |
|
230 | #========================================================================== | |
235 | def test_raw_file_ok(self): |
|
231 | def test_raw_file_ok(self): | |
236 | self.log_user() |
|
232 | self.log_user() | |
237 | response = self.app.get(url(controller='files', action='rawfile', |
|
233 | response = self.app.get(url(controller='files', action='rawfile', | |
238 | repo_name=HG_REPO, |
|
234 | repo_name=HG_REPO, | |
239 | revision='27cd5cce30c96924232dffcd24178a07ffeb5dfc', |
|
235 | revision='27cd5cce30c96924232dffcd24178a07ffeb5dfc', | |
240 | f_path='vcs/nodes.py')) |
|
236 | f_path='vcs/nodes.py')) | |
241 |
|
237 | |||
242 | assert response.content_disposition == "attachment; filename=nodes.py" |
|
238 | assert response.content_disposition == "attachment; filename=nodes.py" | |
243 | assert response.content_type == "text/x-python" |
|
239 | assert response.content_type == "text/x-python" | |
244 |
|
240 | |||
245 | def test_raw_file_wrong_cs(self): |
|
241 | def test_raw_file_wrong_cs(self): | |
246 | self.log_user() |
|
242 | self.log_user() | |
247 | rev = u'ERRORce30c96924232dffcd24178a07ffeb5dfc' |
|
243 | rev = u'ERRORce30c96924232dffcd24178a07ffeb5dfc' | |
248 | f_path = 'vcs/nodes.py' |
|
244 | f_path = 'vcs/nodes.py' | |
249 |
|
245 | |||
250 | response = self.app.get(url(controller='files', action='rawfile', |
|
246 | response = self.app.get(url(controller='files', action='rawfile', | |
251 | repo_name=HG_REPO, |
|
247 | repo_name=HG_REPO, | |
252 | revision=rev, |
|
248 | revision=rev, | |
253 | f_path=f_path)) |
|
249 | f_path=f_path)) | |
254 |
|
250 | |||
255 | msg = """Revision %r does not exist for this repository""" % (rev) |
|
251 | msg = """Revision %r does not exist for this repository""" % (rev) | |
256 | self.checkSessionFlash(response, msg) |
|
252 | self.checkSessionFlash(response, msg) | |
257 |
|
253 | |||
258 | msg = """%s""" % (HG_REPO) |
|
254 | msg = """%s""" % (HG_REPO) | |
259 | self.checkSessionFlash(response, msg) |
|
255 | self.checkSessionFlash(response, msg) | |
260 |
|
256 | |||
261 | def test_raw_file_wrong_f_path(self): |
|
257 | def test_raw_file_wrong_f_path(self): | |
262 | self.log_user() |
|
258 | self.log_user() | |
263 | rev = '27cd5cce30c96924232dffcd24178a07ffeb5dfc' |
|
259 | rev = '27cd5cce30c96924232dffcd24178a07ffeb5dfc' | |
264 | f_path = 'vcs/ERRORnodes.py' |
|
260 | f_path = 'vcs/ERRORnodes.py' | |
265 | response = self.app.get(url(controller='files', action='rawfile', |
|
261 | response = self.app.get(url(controller='files', action='rawfile', | |
266 | repo_name=HG_REPO, |
|
262 | repo_name=HG_REPO, | |
267 | revision=rev, |
|
263 | revision=rev, | |
268 | f_path=f_path)) |
|
264 | f_path=f_path)) | |
269 |
|
265 | |||
270 | msg = "There is no file nor directory at the given path: %r at revision %r" % (f_path, rev[:12]) |
|
266 | msg = "There is no file nor directory at the given path: %r at revision %r" % (f_path, rev[:12]) | |
271 | self.checkSessionFlash(response, msg) |
|
267 | self.checkSessionFlash(response, msg) | |
272 |
|
268 | |||
273 | #========================================================================== |
|
269 | #========================================================================== | |
274 | # RAW RESPONSE - PLAIN |
|
270 | # RAW RESPONSE - PLAIN | |
275 | #========================================================================== |
|
271 | #========================================================================== | |
276 | def test_raw_ok(self): |
|
272 | def test_raw_ok(self): | |
277 | self.log_user() |
|
273 | self.log_user() | |
278 | response = self.app.get(url(controller='files', action='raw', |
|
274 | response = self.app.get(url(controller='files', action='raw', | |
279 | repo_name=HG_REPO, |
|
275 | repo_name=HG_REPO, | |
280 | revision='27cd5cce30c96924232dffcd24178a07ffeb5dfc', |
|
276 | revision='27cd5cce30c96924232dffcd24178a07ffeb5dfc', | |
281 | f_path='vcs/nodes.py')) |
|
277 | f_path='vcs/nodes.py')) | |
282 |
|
278 | |||
283 | assert response.content_type == "text/plain" |
|
279 | assert response.content_type == "text/plain" | |
284 |
|
280 | |||
285 | def test_raw_wrong_cs(self): |
|
281 | def test_raw_wrong_cs(self): | |
286 | self.log_user() |
|
282 | self.log_user() | |
287 | rev = u'ERRORcce30c96924232dffcd24178a07ffeb5dfc' |
|
283 | rev = u'ERRORcce30c96924232dffcd24178a07ffeb5dfc' | |
288 | f_path = 'vcs/nodes.py' |
|
284 | f_path = 'vcs/nodes.py' | |
289 |
|
285 | |||
290 | response = self.app.get(url(controller='files', action='raw', |
|
286 | response = self.app.get(url(controller='files', action='raw', | |
291 | repo_name=HG_REPO, |
|
287 | repo_name=HG_REPO, | |
292 | revision=rev, |
|
288 | revision=rev, | |
293 | f_path=f_path)) |
|
289 | f_path=f_path)) | |
294 | msg = """Revision %r does not exist for this repository""" % (rev) |
|
290 | msg = """Revision %r does not exist for this repository""" % (rev) | |
295 | self.checkSessionFlash(response, msg) |
|
291 | self.checkSessionFlash(response, msg) | |
296 |
|
292 | |||
297 | msg = """%s""" % (HG_REPO) |
|
293 | msg = """%s""" % (HG_REPO) | |
298 | self.checkSessionFlash(response, msg) |
|
294 | self.checkSessionFlash(response, msg) | |
299 |
|
295 | |||
300 | def test_raw_wrong_f_path(self): |
|
296 | def test_raw_wrong_f_path(self): | |
301 | self.log_user() |
|
297 | self.log_user() | |
302 | rev = '27cd5cce30c96924232dffcd24178a07ffeb5dfc' |
|
298 | rev = '27cd5cce30c96924232dffcd24178a07ffeb5dfc' | |
303 | f_path = 'vcs/ERRORnodes.py' |
|
299 | f_path = 'vcs/ERRORnodes.py' | |
304 | response = self.app.get(url(controller='files', action='raw', |
|
300 | response = self.app.get(url(controller='files', action='raw', | |
305 | repo_name=HG_REPO, |
|
301 | repo_name=HG_REPO, | |
306 | revision=rev, |
|
302 | revision=rev, | |
307 | f_path=f_path)) |
|
303 | f_path=f_path)) | |
308 | msg = "There is no file nor directory at the given path: %r at revision %r" % (f_path, rev[:12]) |
|
304 | msg = "There is no file nor directory at the given path: %r at revision %r" % (f_path, rev[:12]) | |
309 | self.checkSessionFlash(response, msg) |
|
305 | self.checkSessionFlash(response, msg) | |
310 |
|
306 | |||
311 | def test_ajaxed_files_list(self): |
|
307 | def test_ajaxed_files_list(self): | |
312 | self.log_user() |
|
308 | self.log_user() | |
313 | rev = '27cd5cce30c96924232dffcd24178a07ffeb5dfc' |
|
309 | rev = '27cd5cce30c96924232dffcd24178a07ffeb5dfc' | |
314 | response = self.app.get( |
|
310 | response = self.app.get( | |
315 | url('files_nodelist_home', repo_name=HG_REPO,f_path='/',revision=rev), |
|
311 | url('files_nodelist_home', repo_name=HG_REPO,f_path='/',revision=rev), | |
316 | extra_environ={'HTTP_X_PARTIAL_XHR': '1'}, |
|
312 | extra_environ={'HTTP_X_PARTIAL_XHR': '1'}, | |
317 | ) |
|
313 | ) | |
318 | response.mustcontain("vcs/web/simplevcs/views/repository.py") |
|
314 | response.mustcontain("vcs/web/simplevcs/views/repository.py") |
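For context, the substantive change in this hunk is that the expected markup now carries `selected="selected"` on the options for the current branch (`default`) and the current tag (`tip`), so the browser preselects them in the revision dropdown. A minimal standard-library sketch (not part of the change under review) of how that attribute determines the preselected value:

```python
# Hypothetical illustration: extract the value of every <option> that a
# browser would preselect, i.e. those carrying a "selected" attribute.
from html.parser import HTMLParser


class SelectedOptions(HTMLParser):
    """Collect the value attribute of each selected <option>."""

    def __init__(self):
        super().__init__()
        self.selected = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "option" and "selected" in attrs:
            self.selected.append(attrs["value"])


# Fragment taken from the expected markup in the test above.
html = '''
<optgroup label="Branches">
  <option selected="selected" value="27cd5cce30c96924232dffcd24178a07ffeb5dfc">default</option>
  <option value="97e8b885c04894463c51898e14387d80c30ed1ee">git</option>
</optgroup>
'''

parser = SelectedOptions()
parser.feed(html)
print(parser.selected)  # ['27cd5cce30c96924232dffcd24178a07ffeb5dfc']
```

The test itself only needs `response.mustcontain(...)` substring checks, but a parser like this is a sturdier way to assert selection state if the surrounding markup ever changes.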