@@ -0,0 +1,154 @@
# Created by .ignore support plugin (hsz.mobi)
syntax: glob

### Example user template

# IntelliJ project files
.idea
*.iml
out
gen

### Python template
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover
.hypothesis/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py

# Flask instance folder
instance/

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# IPython Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# dotenv
.env

# virtualenv
venv/
ENV/

# Spyder project settings
.spyderproject

# Rope project settings
.ropeproject


syntax: regexp
^\.idea$
syntax: regexp
^\.settings$
syntax: regexp
^data$
syntax: regexp
^webassets$
syntax: regexp
^dist$
syntax: regexp
^\.project$
syntax: regexp
^\.pydevproject$
syntax: regexp
^private$
syntax: regexp
^appenlight_frontend/build$
syntax: regexp
^appenlight_frontend/bower_components$
syntax: regexp
^appenlight_frontend/node_modules$
^src/node_modules$
syntax: regexp
^\.pydevproject$
syntax: regexp
appenlight\.egg-info$
syntax: regexp
\.pyc$
syntax: regexp
celerybeat.*
syntax: regexp
\.iml$
syntax: regexp
^frontend/build$
syntax: regexp
^frontend/bower_components$
syntax: regexp
^frontend/node_modules$
^frontend/src/node_modules$
^frontend/build$

syntax: regexp
\.db$

syntax: regexp
packer_cache

syntax: regexp
packer/packer

syntax: regexp
install_appenlight_production.yaml
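The glob and regexp sections above can be sanity-checked outside Mercurial. The sketch below is not part of the commit and only approximates hg's behavior (the real matcher lives in `mercurial.match` and handles rooted globs and path components differently); it simply shows how a handful of the patterns above classify paths, using a hypothetical `is_ignored` helper:

```python
import fnmatch
import re

# A small, illustrative subset of the ignore file above.
GLOBS = ["__pycache__/*", "*.py[cod]", "*.egg-info/*", ".coverage*"]
REGEXPS = [r"^\.idea$", r"\.pyc$", r"^frontend/node_modules"]

def is_ignored(path):
    """Rough approximation of hg's mixed glob/regexp ignore matching."""
    # "syntax: glob" rules: try the full path and the basename.
    for pattern in GLOBS:
        if fnmatch.fnmatch(path, pattern) or fnmatch.fnmatch(path.rsplit("/", 1)[-1], pattern):
            return True
    # "syntax: regexp" rules: unanchored search; patterns anchor themselves
    # with ^ and $ where they need to (as the file above does).
    return any(re.search(pattern, path) for pattern in REGEXPS)

print(is_ignored("module.pyc"))               # True  (glob *.py[cod], regexp \.pyc$)
print(is_ignored("frontend/node_modules/x"))  # True  (regexp ^frontend/node_modules)
print(is_ignored("app/views.py"))             # False
```

Note the duplicated `syntax: regexp` markers in the file are harmless: the directive simply sets the mode for the lines that follow, so repeating it before each pattern is redundant but valid.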
@@ -0,0 +1,704 @@
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License, version 3
(only), as published by the Free Software Foundation.


This program incorporates work covered by the following copyright and
permission notice:

Copyright (c) 2014-2016 - packaging
file:
Copyright (c) 2008-2011 - msgpack-python
file:licenses/msgpack_license.txt
Copyright (c) 2007-2008 - amqp
file:licenses/amqp_license.txt
Copyright (c) 2013 - bcrypt
file:licenses/bcrypt_license.txt
Copyright (c) 2015 - elasticsearch
file:licenses/elasticsearch_license.txt
Copyright (c) 2011-2013 - gevent-websocket
file:licenses/gevent_websocket_license.txt
Copyright (c) 2015 - python-editor
file:licenses/python_editor_license.txt
Copyright (c) 2015 - requests
file:licenses/requests_license.txt
Copyright (c) 2014 - requests-toolbelt
file:licenses/requests_toolbelt_license.txt

All licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
37 | See the License for the specific language governing permissions and | |||
|
38 | imitations under the License. | |||


Below is the full text of GNU Affero General Public License, version 3


GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007

Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

The precise terms and conditions for copying, distribution and
modification follow.

TERMS AND CONDITIONS

0. Definitions.

"This License" refers to version 3 of the GNU Affero General Public License.

"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based
on the Program.

To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

1. Source Code.

The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.

2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
|
481 | propagate that work, subject to this License. You are not responsible | |||
|
482 | for enforcing compliance by third parties with this License. | |||
|
483 | ||||
|
484 | An "entity transaction" is a transaction transferring control of an | |||
|
485 | organization, or substantially all assets of one, or subdividing an | |||
|
486 | organization, or merging organizations. If propagation of a covered | |||
|
487 | work results from an entity transaction, each party to that | |||
|
488 | transaction who receives a copy of the work also receives whatever | |||
|
489 | licenses to the work the party's predecessor in interest had or could | |||
|
490 | give under the previous paragraph, plus a right to possession of the | |||
|
491 | Corresponding Source of the work from the predecessor in interest, if | |||
|
492 | the predecessor has it or can get it with reasonable efforts. | |||
|
493 | ||||
|
494 | You may not impose any further restrictions on the exercise of the | |||
|
495 | rights granted or affirmed under this License. For example, you may | |||
|
496 | not impose a license fee, royalty, or other charge for exercise of | |||
|
497 | rights granted under this License, and you may not initiate litigation | |||
|
498 | (including a cross-claim or counterclaim in a lawsuit) alleging that | |||
|
499 | any patent claim is infringed by making, using, selling, offering for | |||
|
500 | sale, or importing the Program or any portion of it. | |||
|
501 | ||||
|
502 | 11. Patents. | |||
|
503 | ||||
|
504 | A "contributor" is a copyright holder who authorizes use under this | |||
|
505 | License of the Program or a work on which the Program is based. The | |||
|
506 | work thus licensed is called the contributor's "contributor version". | |||
|
507 | ||||
|
508 | A contributor's "essential patent claims" are all patent claims | |||
|
509 | owned or controlled by the contributor, whether already acquired or | |||
|
510 | hereafter acquired, that would be infringed by some manner, permitted | |||
|
511 | by this License, of making, using, or selling its contributor version, | |||
|
512 | but do not include claims that would be infringed only as a | |||
|
513 | consequence of further modification of the contributor version. For | |||
|
514 | purposes of this definition, "control" includes the right to grant | |||
|
515 | patent sublicenses in a manner consistent with the requirements of | |||
|
516 | this License. | |||
|
517 | ||||
|
518 | Each contributor grants you a non-exclusive, worldwide, royalty-free | |||
|
519 | patent license under the contributor's essential patent claims, to | |||
|
520 | make, use, sell, offer for sale, import and otherwise run, modify and | |||
|
521 | propagate the contents of its contributor version. | |||
|
522 | ||||
|
523 | In the following three paragraphs, a "patent license" is any express | |||
|
524 | agreement or commitment, however denominated, not to enforce a patent | |||
|
525 | (such as an express permission to practice a patent or covenant not to | |||
|
526 | sue for patent infringement). To "grant" such a patent license to a | |||
|
527 | party means to make such an agreement or commitment not to enforce a | |||
|
528 | patent against the party. | |||
|
529 | ||||
|
530 | If you convey a covered work, knowingly relying on a patent license, | |||
|
531 | and the Corresponding Source of the work is not available for anyone | |||
|
532 | to copy, free of charge and under the terms of this License, through a | |||
|
533 | publicly available network server or other readily accessible means, | |||
|
534 | then you must either (1) cause the Corresponding Source to be so | |||
|
535 | available, or (2) arrange to deprive yourself of the benefit of the | |||
|
536 | patent license for this particular work, or (3) arrange, in a manner | |||
|
537 | consistent with the requirements of this License, to extend the patent | |||
|
538 | license to downstream recipients. "Knowingly relying" means you have | |||
|
539 | actual knowledge that, but for the patent license, your conveying the | |||
|
540 | covered work in a country, or your recipient's use of the covered work | |||
|
541 | in a country, would infringe one or more identifiable patents in that | |||
|
542 | country that you have reason to believe are valid. | |||
|
543 | ||||
|
544 | If, pursuant to or in connection with a single transaction or | |||
|
545 | arrangement, you convey, or propagate by procuring conveyance of, a | |||
|
546 | covered work, and grant a patent license to some of the parties | |||
|
547 | receiving the covered work authorizing them to use, propagate, modify | |||
|
548 | or convey a specific copy of the covered work, then the patent license | |||
|
549 | you grant is automatically extended to all recipients of the covered | |||
|
550 | work and works based on it. | |||
|
551 | ||||
|
552 | A patent license is "discriminatory" if it does not include within | |||
|
553 | the scope of its coverage, prohibits the exercise of, or is | |||
|
554 | conditioned on the non-exercise of one or more of the rights that are | |||
|
555 | specifically granted under this License. You may not convey a covered | |||
|
556 | work if you are a party to an arrangement with a third party that is | |||
|
557 | in the business of distributing software, under which you make payment | |||
|
558 | to the third party based on the extent of your activity of conveying | |||
|
559 | the work, and under which the third party grants, to any of the | |||
|
560 | parties who would receive the covered work from you, a discriminatory | |||
|
561 | patent license (a) in connection with copies of the covered work | |||
|
562 | conveyed by you (or copies made from those copies), or (b) primarily | |||
|
563 | for and in connection with specific products or compilations that | |||
|
564 | contain the covered work, unless you entered into that arrangement, | |||
|
565 | or that patent license was granted, prior to 28 March 2007. | |||
|
566 | ||||
|
567 | Nothing in this License shall be construed as excluding or limiting | |||
|
568 | any implied license or other defenses to infringement that may | |||
|
569 | otherwise be available to you under applicable patent law. | |||
|
570 | ||||
|
571 | 12. No Surrender of Others' Freedom. | |||
|
572 | ||||
|
573 | If conditions are imposed on you (whether by court order, agreement or | |||
|
574 | otherwise) that contradict the conditions of this License, they do not | |||
|
575 | excuse you from the conditions of this License. If you cannot convey a | |||
|
576 | covered work so as to satisfy simultaneously your obligations under this | |||
|
577 | License and any other pertinent obligations, then as a consequence you may | |||
|
578 | not convey it at all. For example, if you agree to terms that obligate you | |||
|
579 | to collect a royalty for further conveying from those to whom you convey | |||
|
580 | the Program, the only way you could satisfy both those terms and this | |||
|
581 | License would be to refrain entirely from conveying the Program. | |||
|
582 | ||||
|
583 | 13. Remote Network Interaction; Use with the GNU General Public License. | |||
|
584 | ||||
|
585 | Notwithstanding any other provision of this License, if you modify the | |||
|
586 | Program, your modified version must prominently offer all users | |||
|
587 | interacting with it remotely through a computer network (if your version | |||
|
588 | supports such interaction) an opportunity to receive the Corresponding | |||
|
589 | Source of your version by providing access to the Corresponding Source | |||
|
590 | from a network server at no charge, through some standard or customary | |||
|
591 | means of facilitating copying of software. This Corresponding Source | |||
|
592 | shall include the Corresponding Source for any work covered by version 3 | |||
|
593 | of the GNU General Public License that is incorporated pursuant to the | |||
|
594 | following paragraph. | |||
|
595 | ||||
|
596 | Notwithstanding any other provision of this License, you have | |||
|
597 | permission to link or combine any covered work with a work licensed | |||
|
598 | under version 3 of the GNU General Public License into a single | |||
|
599 | combined work, and to convey the resulting work. The terms of this | |||
|
600 | License will continue to apply to the part which is the covered work, | |||
|
601 | but the work with which it is combined will remain governed by version | |||
|
602 | 3 of the GNU General Public License. | |||
|
603 | ||||
|
604 | 14. Revised Versions of this License. | |||
|
605 | ||||
|
606 | The Free Software Foundation may publish revised and/or new versions of | |||
|
607 | the GNU Affero General Public License from time to time. Such new versions | |||
|
608 | will be similar in spirit to the present version, but may differ in detail to | |||
|
609 | address new problems or concerns. | |||
|
610 | ||||
|
611 | Each version is given a distinguishing version number. If the | |||
|
612 | Program specifies that a certain numbered version of the GNU Affero General | |||
|
613 | Public License "or any later version" applies to it, you have the | |||
|
614 | option of following the terms and conditions either of that numbered | |||
|
615 | version or of any later version published by the Free Software | |||
|
616 | Foundation. If the Program does not specify a version number of the | |||
|
617 | GNU Affero General Public License, you may choose any version ever published | |||
|
618 | by the Free Software Foundation. | |||
|
619 | ||||
|
620 | If the Program specifies that a proxy can decide which future | |||
|
621 | versions of the GNU Affero General Public License can be used, that proxy's | |||
|
622 | public statement of acceptance of a version permanently authorizes you | |||
|
623 | to choose that version for the Program. | |||
|
624 | ||||
|
625 | Later license versions may give you additional or different | |||
|
626 | permissions. However, no additional obligations are imposed on any | |||
|
627 | author or copyright holder as a result of your choosing to follow a | |||
|
628 | later version. | |||
|
629 | ||||
|
630 | 15. Disclaimer of Warranty. | |||
|
631 | ||||
|
632 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY | |||
|
633 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT | |||
|
634 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY | |||
|
635 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, | |||
|
636 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR | |||
|
637 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM | |||
|
638 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF | |||
|
639 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION. | |||
|
640 | ||||
|
641 | 16. Limitation of Liability. | |||
|
642 | ||||
|
643 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING | |||
|
644 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS | |||
|
645 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY | |||
|
646 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE | |||
|
647 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF | |||
|
648 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD | |||
|
649 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), | |||
|
650 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF | |||
|
651 | SUCH DAMAGES. | |||
|
652 | ||||
|
653 | 17. Interpretation of Sections 15 and 16. | |||
|
654 | ||||
|
655 | If the disclaimer of warranty and limitation of liability provided | |||
|
656 | above cannot be given local legal effect according to their terms, | |||
|
657 | reviewing courts shall apply local law that most closely approximates | |||
|
658 | an absolute waiver of all civil liability in connection with the | |||
|
659 | Program, unless a warranty or assumption of liability accompanies a | |||
|
660 | copy of the Program in return for a fee. | |||
|
661 | ||||
|
662 | END OF TERMS AND CONDITIONS | |||
|
663 | ||||
|
664 | How to Apply These Terms to Your New Programs | |||
|
665 | ||||
|
666 | If you develop a new program, and you want it to be of the greatest | |||
|
667 | possible use to the public, the best way to achieve this is to make it | |||
|
668 | free software which everyone can redistribute and change under these terms. | |||
|
669 | ||||
|
670 | To do so, attach the following notices to the program. It is safest | |||
|
671 | to attach them to the start of each source file to most effectively | |||
|
672 | state the exclusion of warranty; and each file should have at least | |||
|
673 | the "copyright" line and a pointer to where the full notice is found. | |||
|
674 | ||||
|
675 | <one line to give the program's name and a brief idea of what it does.> | |||
|
676 | Copyright (C) <year> <name of author> | |||
|
677 | ||||
|
678 | This program is free software: you can redistribute it and/or modify | |||
|
679 | it under the terms of the GNU Affero General Public License as published by | |||
|
680 | the Free Software Foundation, either version 3 of the License, or | |||
|
681 | (at your option) any later version. | |||
|
682 | ||||
|
683 | This program is distributed in the hope that it will be useful, | |||
|
684 | but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
685 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
686 | GNU Affero General Public License for more details. | |||
|
687 | ||||
|
688 | You should have received a copy of the GNU Affero General Public License | |||
|
689 | along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
690 | ||||
|
691 | Also add information on how to contact you by electronic and paper mail. | |||
|
692 | ||||
|
693 | If your software can interact with users remotely through a computer | |||
|
694 | network, you should also make sure that it provides a way for users to | |||
|
695 | get its source. For example, if your program is a web application, its | |||
|
696 | interface could display a "Source" link that leads users to an archive | |||
|
697 | of the code. There are many ways you could offer source, and different | |||
|
698 | solutions will be better for different programs; see section 13 for the | |||
|
699 | specific requirements. | |||
|
700 | ||||
|
701 | You should also get your employer (if you work as a programmer) or school, | |||
|
702 | if any, to sign a "copyright disclaimer" for the program, if necessary. | |||
|
703 | For more information on this, and how to apply and follow the GNU AGPL, see | |||
|
704 | <http://www.gnu.org/licenses/>. |
@@ -0,0 +1,62 b'' | |||||
|
1 | # A generic, single database configuration. | |||
|
2 | ||||
|
3 | [alembic] | |||
|
4 | # path to migration scripts | |||
|
5 | script_location = appenlight:migrations | |||
|
6 | version_table = alembic_appenlight_version | |||
|
7 | ||||
|
8 | appenlight_config_file = development.ini | |||
|
9 | ||||
|
10 | # template used to generate migration files | |||
|
11 | # file_template = %%(rev)s_%%(slug)s | |||
|
12 | ||||
|
13 | # max length of characters to apply to the | |||
|
14 | # "slug" field | |||
|
15 | #truncate_slug_length = 40 | |||
|
16 | ||||
|
17 | # set to 'true' to run the environment during | |||
|
18 | # the 'revision' command, regardless of autogenerate | |||
|
19 | # revision_environment = false | |||
|
20 | ||||
|
21 | # set to 'true' to allow .pyc and .pyo files without | |||
|
22 | # a source .py file to be detected as revisions in the | |||
|
23 | # versions/ directory | |||
|
24 | # sourceless = false | |||
|
25 | ||||
|
26 | sqlalchemy.url = postgresql://test:test@localhost/appenlight | |||
|
27 | ||||
|
28 | ||||
|
29 | # Logging configuration | |||
|
30 | [loggers] | |||
|
31 | keys = root,sqlalchemy,alembic | |||
|
32 | ||||
|
33 | [handlers] | |||
|
34 | keys = console | |||
|
35 | ||||
|
36 | [formatters] | |||
|
37 | keys = generic | |||
|
38 | ||||
|
39 | [logger_root] | |||
|
40 | level = WARN | |||
|
41 | handlers = console | |||
|
42 | qualname = | |||
|
43 | ||||
|
44 | [logger_sqlalchemy] | |||
|
45 | level = WARN | |||
|
46 | handlers = | |||
|
47 | qualname = sqlalchemy.engine | |||
|
48 | ||||
|
49 | [logger_alembic] | |||
|
50 | level = INFO | |||
|
51 | handlers = | |||
|
52 | qualname = alembic | |||
|
53 | ||||
|
54 | [handler_console] | |||
|
55 | class = StreamHandler | |||
|
56 | args = (sys.stderr,) | |||
|
57 | level = NOTSET | |||
|
58 | formatter = generic | |||
|
59 | ||||
|
60 | [formatter_generic] | |||
|
61 | format = %(levelname)-5.5s [%(name)s] %(message)s | |||
|
62 | datefmt = %H:%M:%S |
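As a sanity check on the logging layout in the alembic config above, the file can be parsed with Python's stdlib `configparser`; the snippet below embeds a minimal slice of the same sections for illustration (a sketch, not part of the repo):

```python
from configparser import ConfigParser

# A minimal slice of the alembic config above, embedded for illustration.
INI = """
[alembic]
script_location = appenlight:migrations
version_table = alembic_appenlight_version

[loggers]
keys = root,sqlalchemy,alembic

[logger_alembic]
level = INFO
handlers =
qualname = alembic
"""

parser = ConfigParser()
parser.read_string(INI)

# The [loggers] "keys" entry names every logger_* section that must exist.
logger_keys = [k.strip() for k in parser["loggers"]["keys"].split(",")]
print(logger_keys)
print(parser["logger_alembic"]["level"])
```

Note that the `[loggers]` / `[handlers]` / `[formatters]` triple is the standard `logging.config.fileConfig` layout, which is what consumes this file at migration time.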
@@ -0,0 +1,2 b'' | |||||
|
1 | include *.txt *.ini *.cfg *.rst *.md | |||
|
2 | recursive-include appenlight *.ico *.png *.css *.gif *.jpg *.pt *.txt *.mak *.mako *.js *.html *.xml *.jinja2 *.rst *.otf *.ttf *.svg *.woff *.eot |
@@ -0,0 +1,39 b'' | |||||
|
1 | # appenlight README | |||
|
2 | ||||
|
3 | ||||
|
4 | To run the app you need to meet the following prerequisites: | |||

5 | ||||

6 | - a running Elasticsearch instance (2.2 tested) | |||

7 | - a running PostgreSQL 9.5 server | |||

8 | - a running Redis server | |||
|
9 | ||||
|
10 | # Setup basics | |||
|
11 | ||||
|
12 | Set up the basic application database schema: | |||
|
13 | ||||
|
14 | appenlight_initialize_db config.ini | |||
|
15 | ||||
|
16 | Set up the basic Elasticsearch schema: | |||
|
17 | ||||
|
18 | appenlight-reindex-elasticsearch -c config.ini -t all | |||
|
19 | ||||
|
20 | ||||
|
21 | # Running | |||
|
22 | ||||
|
23 | To run the application itself: | |||
|
24 | ||||
|
25 | pserve --reload development.ini | |||
|
26 | ||||
|
27 | To run Celery queue processing: | |||
|
28 | ||||
|
29 | celery worker -A appenlight.celery -Q "reports,logs,metrics,default" --ini=development.ini | |||
|
30 | ||||
|
31 | ||||
|
32 | # Testing | |||
|
33 | ||||
|
34 | To run the test suite: | |||
|
35 | ||||
|
36 | py.test appenlight/tests/tests.py --cov appenlight (this looks for testing.ini in the repo root) | |||
|
37 | ||||
|
38 | WARNING! | |||

39 | Some tests will insert data into Elasticsearch or Redis based on testing.ini
@@ -0,0 +1,49 b'' | |||||
|
1 | repoze.sendmail==4.1 | |||
|
2 | pyramid==1.7 | |||
|
3 | pyramid_tm==0.12 | |||
|
4 | pyramid_debugtoolbar | |||
|
5 | pyramid_authstack==1.0.1 | |||
|
6 | SQLAlchemy==1.0.12 | |||
|
7 | alembic==0.8.6 | |||
|
8 | webhelpers2==2.0 | |||
|
9 | transaction==1.4.3 | |||
|
10 | zope.sqlalchemy==0.7.6 | |||
|
11 | pyramid_mailer==0.14.1 | |||
|
12 | redis==2.10.5 | |||
|
13 | redlock-py==1.0.8 | |||
|
14 | pyramid_jinja2==2.6.2 | |||
|
15 | psycopg2==2.6.1 | |||
|
16 | wtforms==2.1 | |||
|
17 | celery==3.1.23 | |||
|
18 | formencode==1.3.0 | |||
|
19 | psutil==2.1.2 | |||
|
20 | ziggurat_foundations>=0.6.7 | |||
|
21 | bcrypt==2.0.0 | |||
|
22 | appenlight_client | |||
|
23 | markdown==2.5 | |||
|
24 | colander==1.2 | |||
|
25 | defusedxml==0.4.1 | |||
|
26 | dogpile.cache==0.5.7 | |||
|
27 | pyramid_redis_sessions==1.0.1 | |||
|
28 | simplejson==3.8.2 | |||
|
29 | waitress==0.9.0 | |||
|
30 | gunicorn==19.4.5 | |||
|
31 | requests==2.9.1 | |||
|
32 | requests_oauthlib==0.6.1 | |||
|
33 | gevent==1.1.1 | |||
|
34 | gevent-websocket==0.9.5 | |||
|
35 | pygments==2.1.3 | |||
|
36 | lxml==3.6.0 | |||
|
37 | paginate==0.5.4 | |||
|
38 | paginate-sqlalchemy==0.2.0 | |||
|
39 | pyelasticsearch==1.4 | |||
|
40 | six==1.8.0 | |||
|
41 | mock==1.0.1 | |||
|
42 | itsdangerous==0.24 | |||
|
43 | camplight==0.9.6 | |||
|
44 | jira==0.41 | |||
|
45 | python-dateutil==2.5.3 | |||
|
46 | authomatic==0.1.0.post1 | |||
|
47 | cryptography==1.2.3 | |||
|
48 | webassets==0.11.1 | |||
|
49 |
@@ -0,0 +1,60 b'' | |||||
|
1 | import os | |||
|
2 | import sys | |||
|
3 | import re | |||
|
4 | ||||
|
5 | from setuptools import setup, find_packages | |||
|
6 | ||||
|
7 | here = os.path.abspath(os.path.dirname(__file__)) | |||
|
8 | README = open(os.path.join(here, 'README.md')).read() | |||
|
9 | CHANGES = open(os.path.join(here, 'CHANGES.txt')).read() | |||
|
10 | ||||
|
11 | REQUIREMENTS = open(os.path.join(here, 'requirements.txt')).readlines() | |||
|
12 | ||||
|
13 | compiled = re.compile('([^=><]*).*') | |||
|
14 | ||||
|
15 | ||||
|
16 | def parse_req(req): | |||
|
17 | return compiled.search(req).group(1).strip() | |||
|
18 | ||||
|
19 | ||||
|
20 | requires = [_f for _f in map(parse_req, REQUIREMENTS) if _f] | |||
|
21 | ||||
|
22 | if sys.version_info[:3] < (2, 5, 0): | |||
|
23 | requires.append('pysqlite') | |||
|
24 | ||||
|
25 | found_packages = find_packages('src') | |||
|
26 | found_packages.append('appenlight.migrations.versions') | |||
|
27 | setup(name='appenlight', | |||
|
28 | version='0.1', | |||
|
29 | description='appenlight', | |||
|
30 | long_description=README + '\n\n' + CHANGES, | |||
|
31 | classifiers=[ | |||
|
32 | "Programming Language :: Python", | |||
|
33 | "Framework :: Pylons", | |||
|
34 | "Topic :: Internet :: WWW/HTTP", | |||
|
35 | "Topic :: Internet :: WWW/HTTP :: WSGI :: Application", | |||
|
36 | ], | |||
|
37 | author='', | |||
|
38 | author_email='', | |||
|
39 | url='', | |||
|
40 | keywords='web wsgi bfg pylons pyramid', | |||
|
41 | package_dir={'': 'src'}, | |||
|
42 | packages=found_packages, | |||
|
43 | include_package_data=True, | |||
|
44 | zip_safe=False, | |||
|
45 | test_suite='appenlight', | |||
|
46 | install_requires=requires, | |||
|
47 | entry_points={ | |||
|
48 | 'paste.app_factory': [ | |||
|
49 | 'main = appenlight:main' | |||
|
50 | ], | |||
|
51 | 'console_scripts': [ | |||
|
52 | 'appenlight-cleanup = appenlight.scripts.cleanup:main', | |||
|
53 | 'appenlight-initializedb = appenlight.scripts.initialize_db:main', | |||
|
54 | 'appenlight-migratedb = appenlight.scripts.migratedb:main', | |||
|
55 | 'appenlight-reindex-elasticsearch = appenlight.scripts.reindex_elasticsearch:main', | |||
|
56 | 'appenlight-static = appenlight.scripts.static:main', | |||
|
57 | 'appenlight-make-config = appenlight.scripts.make_config:main', | |||
|
58 | ] | |||
|
59 | } | |||
|
60 | ) |
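The `parse_req` helper in setup.py above strips version pins from requirements.txt lines with a regex so the bare package names can feed `install_requires`. A standalone sketch of the same logic, using hypothetical sample lines in the style of the repo's requirements.txt:

```python
import re

# Same pattern as setup.py: capture everything before the first '=', '>' or '<'.
compiled = re.compile('([^=><]*).*')


def parse_req(req):
    # group(1) is the package name; strip() drops the trailing newline/spaces.
    return compiled.search(req).group(1).strip()


# Hypothetical sample lines mimicking requirements.txt entries.
lines = ["SQLAlchemy==1.0.12\n", "ziggurat_foundations>=0.6.7\n",
         "pyramid_debugtoolbar\n", "\n"]
# Empty strings (blank lines) are filtered out, as in setup.py.
names = [name for name in map(parse_req, lines) if name]
print(names)
```

The falsy filter at the end mirrors `[_f for _f in map(parse_req, REQUIREMENTS) if _f]` in setup.py, which discards blank lines.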
@@ -0,0 +1,285 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import datetime | |||
|
23 | import logging | |||
|
24 | import pyelasticsearch | |||
|
25 | import redis | |||
|
26 | import os | |||
|
27 | from pkg_resources import iter_entry_points | |||
|
28 | ||||
|
29 | import appenlight.lib.jinja2_filters as jinja2_filters | |||
|
30 | import appenlight.lib.encryption as encryption | |||
|
31 | ||||
|
32 | from authomatic.providers import oauth2, oauth1 | |||
|
33 | from authomatic import Authomatic | |||
|
34 | from pyramid.config import Configurator, PHASE3_CONFIG | |||
|
35 | from pyramid.authentication import AuthTktAuthenticationPolicy | |||
|
36 | from pyramid.authorization import ACLAuthorizationPolicy | |||
|
37 | from pyramid_mailer.mailer import Mailer | |||
|
38 | from pyramid.renderers import JSON | |||
|
39 | from pyramid_redis_sessions import session_factory_from_settings | |||
|
40 | from pyramid.settings import asbool, aslist | |||
|
41 | from pyramid.security import AllPermissionsList | |||
|
42 | from pyramid_authstack import AuthenticationStackPolicy | |||
|
43 | from redlock import Redlock | |||
|
44 | from sqlalchemy import engine_from_config | |||
|
45 | ||||
|
46 | from appenlight.celery import configure_celery | |||
|
47 | from appenlight.lib import cache_regions | |||
|
48 | from appenlight.lib.ext_json import json | |||
|
49 | from appenlight.security import groupfinder, AuthTokenAuthenticationPolicy | |||
|
50 | ||||
|
51 | json_renderer = JSON(serializer=json.dumps, indent=4) | |||
|
52 | ||||
|
53 | log = logging.getLogger(__name__) | |||
|
54 | ||||
|
55 | ||||
|
56 | def datetime_adapter(obj, request): | |||
|
57 | return obj.isoformat() | |||
|
58 | ||||
|
59 | ||||
|
60 | def all_permissions_adapter(obj, request): | |||
|
61 | return '__all_permissions__' | |||
|
62 | ||||
|
63 | ||||
|
64 | json_renderer.add_adapter(datetime.datetime, datetime_adapter) | |||
|
65 | json_renderer.add_adapter(AllPermissionsList, all_permissions_adapter) | |||
|
66 | ||||
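The two `add_adapter` calls above teach the Pyramid JSON renderer how to serialize types the stdlib encoder can't handle. The same dispatch idea can be sketched with plain `json.dumps` and a `default` hook (an illustration of the mechanism, not Pyramid's internals):

```python
import datetime
import json


def adapt(obj):
    # Mirrors datetime_adapter above: datetimes become ISO-8601 strings.
    if isinstance(obj, datetime.datetime):
        return obj.isoformat()
    raise TypeError("unserializable: %r" % (obj,))


payload = {"created": datetime.datetime(2016, 5, 1, 12, 30)}
# indent=4 matches the json_renderer = JSON(serializer=json.dumps, indent=4) setup.
encoded = json.dumps(payload, default=adapt, indent=4)
print(encoded)
```

Pyramid's `JSON` renderer performs the equivalent per-type lookup for every adapter registered with `add_adapter`, which is why `AllPermissionsList` gets its own adapter returning the `'__all_permissions__'` marker string.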
|
67 | ||||
|
68 | def main(global_config, **settings): | |||
|
69 | """ This function returns a Pyramid WSGI application. | |||
|
70 | """ | |||
|
71 | auth_tkt_policy = AuthTktAuthenticationPolicy( | |||
|
72 | settings['authtkt.secret'], | |||
|
73 | hashalg='sha512', | |||
|
74 | callback=groupfinder, | |||
|
75 | max_age=2592000, | |||
|
76 | secure=asbool(settings.get('authtkt.secure', 'false'))) | |||
|
77 | auth_token_policy = AuthTokenAuthenticationPolicy( | |||
|
78 | callback=groupfinder | |||
|
79 | ) | |||
|
80 | authorization_policy = ACLAuthorizationPolicy() | |||
|
81 | authentication_policy = AuthenticationStackPolicy() | |||
|
82 | authentication_policy.add_policy('auth_tkt', auth_tkt_policy) | |||
|
83 | authentication_policy.add_policy('auth_token', auth_token_policy) | |||
|
84 | # set crypto key | |||
|
85 | encryption.ENCRYPTION_SECRET = settings.get('encryption_secret') | |||
|
86 | # import this later so encyption key can be monkeypatched | |||
|
87 | from appenlight.models import DBSession, register_datastores | |||
|
88 | # update config with cometd info | |||
|
89 | settings['cometd_servers'] = {'server': settings['cometd.server'], | |||
|
90 | 'secret': settings['cometd.secret']} | |||
|
91 | ||||
|
92 | # Create the Pyramid Configurator. | |||
|
93 | settings['_mail_url'] = settings['mailing.app_url'] | |||
|
94 | config = Configurator(settings=settings, | |||
|
95 | authentication_policy=authentication_policy, | |||
|
96 | authorization_policy=authorization_policy, | |||
|
97 | root_factory='appenlight.security.RootFactory', | |||
|
98 | default_permission='view') | |||
|
99 | config.set_default_csrf_options(require_csrf=True, header='X-XSRF-TOKEN') | |||
|
100 | config.add_view_deriver('appenlight.predicates.csrf_view', | |||
|
101 | name='csrf_view') | |||
|
102 | ||||
|
103 | ||||
|
104 | # set up dogpile cache regions now that settings are available | |||
|
105 | dogpile_config = {'url': settings['redis.url'], | |||
|
106 | "redis_expiration_time": 86400, | |||
|
107 | "redis_distributed_lock": True} | |||
|
108 | cache_regions.regions = cache_regions.CacheRegions(dogpile_config) | |||
|
109 | config.registry.cache_regions = cache_regions.regions | |||
|
110 | engine = engine_from_config(settings, 'sqlalchemy.', | |||
|
111 | json_serializer=json.dumps) | |||
|
112 | DBSession.configure(bind=engine) | |||
|
113 | ||||
|
114 | # json rederer that serializes datetime | |||
|
115 | config.add_renderer('json', json_renderer) | |||
|
116 | config.set_request_property('appenlight.lib.request.es_conn', 'es_conn') | |||
|
117 | config.set_request_property('appenlight.lib.request.get_user', 'user', | |||
|
118 | reify=True) | |||
|
119 | config.set_request_property('appenlight.lib.request.get_csrf_token', | |||
|
120 | 'csrf_token', reify=True) | |||
|
121 | config.set_request_property('appenlight.lib.request.safe_json_body', | |||
|
122 | 'safe_json_body', reify=True) | |||
|
123 | config.set_request_property('appenlight.lib.request.unsafe_json_body', | |||
|
124 | 'unsafe_json_body', reify=True) | |||
|
125 | config.add_request_method('appenlight.lib.request.add_flash_to_headers', | |||
|
126 | 'add_flash_to_headers') | |||
|
127 | ||||
|
128 | config.include('pyramid_redis_sessions') | |||
|
129 | config.include('pyramid_tm') | |||
|
130 | config.include('pyramid_jinja2') | |||
|
131 | config.include('appenlight_client.ext.pyramid_tween') | |||
|
132 | config.include('ziggurat_foundations.ext.pyramid.sign_in') | |||
|
133 | config.registry.es_conn = pyelasticsearch.ElasticSearch( | |||
|
134 | settings['elasticsearch.nodes']) | |||
|
135 | config.registry.redis_conn = redis.StrictRedis.from_url( | |||
|
136 | settings['redis.url']) | |||
|
137 | ||||
|
138 | config.registry.redis_lockmgr = Redlock([settings['redis.redlock.url'], ], | |||
|
139 | retry_count=0, retry_delay=0) | |||
|
140 | # mailer | |||
|
141 | config.registry.mailer = Mailer.from_settings(settings) | |||
|
142 | ||||
|
143 | # Configure sessions | |||
|
144 | session_factory = session_factory_from_settings(settings) | |||
|
145 | config.set_session_factory(session_factory) | |||
|
146 | ||||
|
147 | # Configure renderers and event subscribers | |||
|
148 | config.add_jinja2_extension('jinja2.ext.loopcontrols') | |||
|
149 | config.add_jinja2_search_path('appenlight:templates') | |||
|
150 | # event subscribers | |||
|
151 | config.add_subscriber("appenlight.subscribers.application_created", | |||
|
152 | "pyramid.events.ApplicationCreated") | |||
|
153 | config.add_subscriber("appenlight.subscribers.add_renderer_globals", | |||
|
154 | "pyramid.events.BeforeRender") | |||
|
155 | config.add_subscriber('appenlight.subscribers.new_request', | |||
|
156 | 'pyramid.events.NewRequest') | |||
|
157 | config.add_view_predicate('context_type_class', | |||
|
158 | 'appenlight.predicates.contextTypeClass') | |||
|
159 | ||||
|
160 | register_datastores(es_conn=config.registry.es_conn, | |||
|
161 | redis_conn=config.registry.redis_conn, | |||
|
162 | redis_lockmgr=config.registry.redis_lockmgr) | |||
|
163 | ||||
|
164 | # base stuff and scan | |||
|
165 | ||||
|
166 | # need to ensure webassets exists otherwise config.override_asset() | |||
|
167 | # throws exception | |||
|
168 | if not os.path.exists(settings['webassets.dir']): | |||
|
169 | os.mkdir(settings['webassets.dir']) | |||
|
170 | config.add_static_view(path='appenlight:webassets', | |||
|
171 | name='static', cache_max_age=3600) | |||
|
172 | config.override_asset(to_override='appenlight:webassets/', | |||
|
173 | override_with=settings['webassets.dir']) | |||
|
174 | ||||
|
175 | config.include('appenlight.views') | |||
|
176 | config.include('appenlight.views.admin') | |||
|
177 | config.scan(ignore=['appenlight.migrations', | |||
|
178 | 'appenlight.scripts', | |||
|
179 | 'appenlight.tests']) | |||
|
180 | ||||
|
181 | # authomatic social auth | |||
|
182 | authomatic_conf = { | |||
|
183 | # callback http://yourapp.com/social_auth/twitter | |||
|
184 | 'twitter': { | |||
|
185 | 'class_': oauth1.Twitter, | |||
|
186 | 'consumer_key': settings.get('authomatic.pr.twitter.key', 'X'), | |||
|
187 | 'consumer_secret': settings.get('authomatic.pr.twitter.secret', | |||
|
188 | 'X'), | |||
|
189 | }, | |||
|
190 | # callback http://yourapp.com/social_auth/facebook | |||
|
191 | 'facebook': { | |||
|
192 | 'class_': oauth2.Facebook, | |||
|
193 | 'consumer_key': settings.get('authomatic.pr.facebook.app_id', 'X'), | |||
|
194 | 'consumer_secret': settings.get('authomatic.pr.facebook.secret', | |||
|
195 | 'X'), | |||
|
196 | 'scope': ['email'], | |||
|
197 | }, | |||
|
198 | # callback http://yourapp.com/social_auth/google | |||
|
199 | 'google': { | |||
|
200 | 'class_': oauth2.Google, | |||
|
201 | 'consumer_key': settings.get('authomatic.pr.google.key', 'X'), | |||
|
202 | 'consumer_secret': settings.get( | |||
|
203 | 'authomatic.pr.google.secret', 'X'), | |||
|
204 | 'scope': ['profile', 'email'], | |||
|
205 | }, | |||
|
206 | 'github': { | |||
|
207 | 'class_': oauth2.GitHub, | |||
|
208 | 'consumer_key': settings.get('authomatic.pr.github.key', 'X'), | |||
|
209 | 'consumer_secret': settings.get( | |||
|
210 | 'authomatic.pr.github.secret', 'X'), | |||
|
211 | 'scope': ['repo', 'public_repo', 'user:email'], | |||
|
212 | 'access_headers': {'User-Agent': 'AppEnlight'}, | |||
|
213 | }, | |||
|
214 | 'bitbucket': { | |||
|
215 | 'class_': oauth1.Bitbucket, | |||
|
216 | 'consumer_key': settings.get('authomatic.pr.bitbucket.key', 'X'), | |||
|
217 | 'consumer_secret': settings.get( | |||
|
218 | 'authomatic.pr.bitbucket.secret', 'X') | |||
|
219 | } | |||
|
220 | } | |||
|
221 | config.registry.authomatic = Authomatic( | |||
|
222 | config=authomatic_conf, secret=settings['authomatic.secret']) | |||
|
223 | ||||
|
224 | # resource type information | |||
|
225 | config.registry.resource_types = ['resource', 'application'] | |||
|
226 | ||||
|
227 | # plugin information | |||
|
228 | config.registry.appenlight_plugins = {} | |||
|
229 | ||||
|
230 | def register_appenlight_plugin(config, plugin_name, plugin_config): | |||
|
231 | def register(): | |||
|
232 | log.warning('Registering plugin: {}'.format(plugin_name)) | |||
|
233 | if plugin_name not in config.registry.appenlight_plugins: | |||
|
234 | config.registry.appenlight_plugins[plugin_name] = { | |||
|
235 | 'javascript': None, | |||
|
236 | 'static': None, | |||
|
237 | 'css': None, | |||
|
238 | 'top_nav': None, | |||
|
239 | 'celery_tasks': None, | |||
|
240 | 'celery_beats': None, | |||
|
241 | 'fulltext_indexer': None, | |||
|
242 | 'sqlalchemy_migrations': None, | |||
|
243 | 'default_values_setter': None, | |||
|
244 | 'resource_types': [], | |||
|
245 | 'url_gen': None | |||
|
246 | } | |||
|
247 | config.registry.appenlight_plugins[plugin_name].update( | |||
|
248 | plugin_config) | |||
|
249 | # inform AE what kind of resource types we have available | |||
|
250 | # so we can avoid failing when a plugin is removed but data | |||
|
251 | # is still present in the db | |||
|
252 | if plugin_config.get('resource_types'): | |||
|
253 | config.registry.resource_types.extend( | |||
|
254 | plugin_config['resource_types']) | |||
|
255 | ||||
|
256 | config.action('appenlight_plugin={}'.format(plugin_name), register) | |||
|
257 | ||||
|
258 | config.add_directive('register_appenlight_plugin', | |||
|
259 | register_appenlight_plugin) | |||
|
260 | ||||
|
261 | for entry_point in iter_entry_points(group='appenlight.plugins'): | |||
|
262 | plugin = entry_point.load() | |||
|
263 | plugin.includeme(config) | |||
|
264 | ||||
|
265 | # include other appenlight plugins explicitly if needed | |||
|
266 | includes = aslist(settings.get('appenlight.includes', [])) | |||
|
267 | for inc in includes: | |||
|
268 | config.include(inc) | |||
|
269 | ||||
|
270 | # run this after everything registers in configurator | |||
|
271 | ||||
|
272 | def pre_commit(): | |||
|
273 | jinja_env = config.get_jinja2_environment() | |||
|
274 | jinja_env.filters['tojson'] = json.dumps | |||
|
275 | jinja_env.filters['toJSONUnsafe'] = jinja2_filters.toJSONUnsafe | |||
|
276 | ||||
|
277 | config.action(None, pre_commit, order=PHASE3_CONFIG + 999) | |||
|
278 | ||||
|
279 | def wrap_config_celery(): | |||
|
280 | configure_celery(config.registry) | |||
|
281 | ||||
|
282 | config.action(None, wrap_config_celery, order=PHASE3_CONFIG + 999) | |||
|
283 | ||||
|
284 | app = config.make_wsgi_app() | |||
|
285 | return app |
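The `register_appenlight_plugin` directive above queues its `register()` callback via `config.action(...)` under a per-plugin discriminator, so nothing mutates the registry until the configurator commits. A minimal sketch of that deferred-action idea (a toy stand-in for illustration, not Pyramid's real `Configurator.action` implementation):

```python
# Toy stand-in for Pyramid's deferred config.action() mechanism:
# callbacks are queued under a discriminator and only run on commit,
# so conflicting registrations are caught before any state changes.

class TinyConfigurator:
    def __init__(self):
        self._actions = {}          # discriminator -> callback
        self.plugins = {}           # mimics registry.appenlight_plugins

    def action(self, discriminator, callback):
        if discriminator in self._actions:
            raise ValueError('conflicting action: %s' % discriminator)
        self._actions[discriminator] = callback

    def register_plugin(self, name, plugin_config):
        def register():
            self.plugins.setdefault(name, {}).update(plugin_config)
        self.action('appenlight_plugin={}'.format(name), register)

    def commit(self):
        for callback in self._actions.values():
            callback()
        self._actions.clear()


config = TinyConfigurator()
config.register_plugin('demo', {'celery_tasks': ['demo.tasks']})
assert config.plugins == {}          # nothing happens until commit
config.commit()
assert config.plugins['demo']['celery_tasks'] == ['demo.tasks']
```

The commit-time execution is what lets plugins register in any order during `includeme()` while duplicate registrations still raise before anything is applied.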
@@ -0,0 +1,161 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | ||||
|
24 | from datetime import timedelta | |||
|
25 | from celery import Celery | |||
|
26 | from celery.bin import Option | |||
|
27 | from celery.schedules import crontab | |||
|
28 | from celery.signals import worker_init, task_revoked, user_preload_options | |||
|
29 | from celery.signals import task_prerun, task_retry, task_failure, task_success | |||
|
30 | from kombu.serialization import register | |||
|
31 | from pyramid.paster import bootstrap | |||
|
32 | from pyramid.request import Request | |||
|
33 | from pyramid.scripting import prepare | |||
|
34 | from pyramid.settings import asbool | |||
|
35 | from pyramid.threadlocal import get_current_request | |||
|
36 | ||||
|
37 | from appenlight.celery.encoders import json_dumps, json_loads | |||
|
38 | from appenlight_client.ext.celery import register_signals | |||
|
39 | ||||
|
40 | log = logging.getLogger(__name__) | |||
|
41 | ||||
|
42 | register('date_json', json_dumps, json_loads, | |||
|
43 | content_type='application/x-date_json', | |||
|
44 | content_encoding='utf-8') | |||
|
45 | ||||
|
46 | celery = Celery() | |||
|
47 | ||||
|
48 | celery.user_options['preload'].add( | |||
|
49 | Option('--ini', dest='ini', default=None, | |||
|
50 | help='Specifies pyramid configuration file location.') | |||
|
51 | ) | |||
|
52 | ||||
|
53 | @user_preload_options.connect | |||
|
54 | def on_preload_parsed(options, **kwargs): | |||
|
55 | """ | |||
|
56 | Configures Celery from the Pyramid config file. | |||
|
57 | """ | |||
|
58 | celery.conf['INI_PYRAMID'] = options['ini'] | |||
|
59 | import appenlight_client.client as e_client | |||
|
60 | ini_location = options['ini'] | |||
|
61 | if not ini_location: | |||
|
62 | raise Exception('You need to pass the pyramid ini location using ' | |||
|
63 | 'the --ini=filename.ini argument to the worker') | |||
|
64 | env = bootstrap(ini_location) | |||
|
65 | api_key = env['request'].registry.settings['appenlight.api_key'] | |||
|
66 | tr_config = env['request'].registry.settings.get( | |||
|
67 | 'appenlight.transport_config') | |||
|
68 | CONFIG = e_client.get_config({'appenlight.api_key': api_key}) | |||
|
69 | if tr_config: | |||
|
70 | CONFIG['appenlight.transport_config'] = tr_config | |||
|
71 | APPENLIGHT_CLIENT = e_client.Client(CONFIG) | |||
|
72 | # log.addHandler(APPENLIGHT_CLIENT.log_handler) | |||
|
73 | register_signals(APPENLIGHT_CLIENT) | |||
|
74 | celery.pyramid = env | |||
|
75 | ||||
|
76 | ||||
|
77 | celery_config = { | |||
|
78 | 'CELERY_IMPORTS': ["appenlight.celery.tasks",], | |||
|
79 | 'CELERYD_TASK_TIME_LIMIT': 60, | |||
|
80 | 'CELERYD_MAX_TASKS_PER_CHILD': 1000, | |||
|
81 | 'CELERY_IGNORE_RESULT': True, | |||
|
82 | 'CELERY_ACCEPT_CONTENT': ['date_json'], | |||
|
83 | 'CELERY_TASK_SERIALIZER': 'date_json', | |||
|
84 | 'CELERY_RESULT_SERIALIZER': 'date_json', | |||
|
85 | 'BROKER_URL': None, | |||
|
86 | 'CELERYD_CONCURRENCY': None, | |||
|
87 | 'CELERY_TIMEZONE': None, | |||
|
88 | 'CELERYBEAT_SCHEDULE': { | |||
|
89 | 'alerting': { | |||
|
90 | 'task': 'appenlight.celery.tasks.alerting', | |||
|
91 | 'schedule': timedelta(seconds=60) | |||
|
92 | }, | |||
|
93 | 'daily_digest': { | |||
|
94 | 'task': 'appenlight.celery.tasks.daily_digest', | |||
|
95 | 'schedule': crontab(minute=1, hour='4,12,20') | |||
|
96 | }, | |||
|
97 | } | |||
|
98 | } | |||
|
99 | celery.config_from_object(celery_config) | |||
|
100 | ||||
|
101 | def configure_celery(pyramid_registry): | |||
|
102 | settings = pyramid_registry.settings | |||
|
103 | celery_config['BROKER_URL'] = settings['celery.broker_url'] | |||
|
104 | celery_config['CELERYD_CONCURRENCY'] = settings['celery.concurrency'] | |||
|
105 | celery_config['CELERY_TIMEZONE'] = settings['celery.timezone'] | |||
|
106 | if asbool(settings.get('celery.always_eager')): | |||
|
107 | celery_config['CELERY_ALWAYS_EAGER'] = True | |||
|
108 | celery_config['CELERY_EAGER_PROPAGATES_EXCEPTIONS'] = True | |||
|
109 | ||||
|
110 | for plugin in pyramid_registry.appenlight_plugins.values(): | |||
|
111 | if plugin.get('celery_tasks'): | |||
|
112 | celery_config['CELERY_IMPORTS'].extend(plugin['celery_tasks']) | |||
|
113 | if plugin.get('celery_beats'): | |||
|
114 | for name, config in plugin['celery_beats']: | |||
|
115 | celery_config['CELERYBEAT_SCHEDULE'][name] = config | |||
|
116 | celery.config_from_object(celery_config) | |||
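The plugin-merge loop in `configure_celery()` above can be exercised in isolation. A self-contained sketch of how a plugin's `celery_tasks` and `celery_beats` entries fold into the base config (the plugin name and task paths here are made up for illustration):

```python
# Sketch of the plugin merge performed by configure_celery():
# each plugin may contribute extra task imports and beat entries.
from datetime import timedelta

celery_config = {
    'CELERY_IMPORTS': ['appenlight.celery.tasks'],
    'CELERYBEAT_SCHEDULE': {},
}

# hypothetical plugin registry contents, same shape as
# registry.appenlight_plugins
plugins = {
    'example_plugin': {
        'celery_tasks': ['example_plugin.tasks'],
        'celery_beats': [
            ('example_cleanup',
             {'task': 'example_plugin.tasks.cleanup',
              'schedule': timedelta(hours=1)}),
        ],
    },
}

for plugin in plugins.values():
    if plugin.get('celery_tasks'):
        celery_config['CELERY_IMPORTS'].extend(plugin['celery_tasks'])
    if plugin.get('celery_beats'):
        for name, config in plugin['celery_beats']:
            celery_config['CELERYBEAT_SCHEDULE'][name] = config

assert 'example_plugin.tasks' in celery_config['CELERY_IMPORTS']
assert 'example_cleanup' in celery_config['CELERYBEAT_SCHEDULE']
```

After the merge, a second `celery.config_from_object(celery_config)` call (as in the source) makes the combined imports and beat schedule effective.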
|
117 | ||||
|
118 | ||||
|
119 | @task_prerun.connect | |||
|
120 | def task_prerun_signal(task_id, task, args, kwargs, **kwaargs): | |||
|
121 | if hasattr(celery, 'pyramid'): | |||
|
122 | env = celery.pyramid | |||
|
123 | env = prepare(registry=env['request'].registry) | |||
|
124 | proper_base_url = env['request'].registry.settings['mailing.app_url'] | |||
|
125 | tmp_request = Request.blank('/', base_url=proper_base_url) | |||
|
126 | # ensure tasks generate url for right domain from config | |||
|
127 | env['request'].environ['HTTP_HOST'] = tmp_request.environ['HTTP_HOST'] | |||
|
128 | env['request'].environ['SERVER_PORT'] = tmp_request.environ['SERVER_PORT'] | |||
|
129 | env['request'].environ['SERVER_NAME'] = tmp_request.environ['SERVER_NAME'] | |||
|
130 | env['request'].environ['wsgi.url_scheme'] = tmp_request.environ[ | |||
|
131 | 'wsgi.url_scheme'] | |||
|
132 | get_current_request().tm.begin() | |||
|
133 | ||||
|
134 | ||||
|
135 | @task_success.connect | |||
|
136 | def task_success_signal(result, **kwargs): | |||
|
137 | get_current_request().tm.commit() | |||
|
138 | if hasattr(celery, 'pyramid'): | |||
|
139 | celery.pyramid["closer"]() | |||
|
140 | ||||
|
141 | ||||
|
142 | @task_retry.connect | |||
|
143 | def task_retry_signal(request, reason, einfo, **kwargs): | |||
|
144 | get_current_request().tm.abort() | |||
|
145 | if hasattr(celery, 'pyramid'): | |||
|
146 | celery.pyramid["closer"]() | |||
|
147 | ||||
|
148 | ||||
|
149 | @task_failure.connect | |||
|
150 | def task_failure_signal(task_id, exception, args, kwargs, traceback, einfo, | |||
|
151 | **kwaargs): | |||
|
152 | get_current_request().tm.abort() | |||
|
153 | if hasattr(celery, 'pyramid'): | |||
|
154 | celery.pyramid["closer"]() | |||
|
155 | ||||
|
156 | ||||
|
157 | @task_revoked.connect | |||
|
158 | def task_revoked_signal(request, terminated, signum, expired, **kwaargs): | |||
|
159 | get_current_request().tm.abort() | |||
|
160 | if hasattr(celery, 'pyramid'): | |||
|
161 | celery.pyramid["closer"]() |
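The signal handlers above give each Celery task transaction-like semantics: `tm.begin()` on prerun, `tm.commit()` on success, `tm.abort()` on retry, failure, or revocation. A toy transaction manager illustrating that lifecycle (not the `pyramid_tm` / `transaction` package implementation):

```python
# Toy illustration of the per-task transaction lifecycle driven by
# the celery signals: begin -> work -> commit, or begin -> work -> abort.

class ToyTransactionManager:
    def __init__(self):
        self.committed = []   # durable state
        self.pending = []     # work done inside the open transaction

    def begin(self):          # task_prerun
        self.pending = []

    def commit(self):         # task_success
        self.committed.extend(self.pending)
        self.pending = []

    def abort(self):          # task_retry / task_failure / task_revoked
        self.pending = []


tm = ToyTransactionManager()

# successful task: prerun -> work -> success
tm.begin(); tm.pending.append('row-1'); tm.commit()
assert tm.committed == ['row-1']

# failing task: prerun -> work -> failure
tm.begin(); tm.pending.append('row-2'); tm.abort()
assert tm.committed == ['row-1']   # nothing from the failed task persists
```

This is why every non-success handler calls `tm.abort()` followed by the environment `closer()`: a crashed task must leave no half-written rows behind.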
@@ -0,0 +1,65 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import json | |||
|
23 | from datetime import datetime, date, timedelta | |||
|
24 | ||||
|
25 | DATE_FORMAT = '%Y-%m-%dT%H:%M:%S.%f' | |||
|
26 | ||||
|
27 | ||||
|
28 | class DateEncoder(json.JSONEncoder): | |||
|
29 | def default(self, obj): | |||
|
30 | if isinstance(obj, datetime): | |||
|
31 | return { | |||
|
32 | '__type__': '__datetime__', | |||
|
33 | 'iso': obj.strftime(DATE_FORMAT) | |||
|
34 | } | |||
|
35 | elif isinstance(obj, date): | |||
|
36 | return { | |||
|
37 | '__type__': '__date__', | |||
|
38 | 'iso': obj.strftime(DATE_FORMAT) | |||
|
39 | } | |||
|
40 | elif isinstance(obj, timedelta): | |||
|
41 | return { | |||
|
42 | '__type__': '__timedelta__', | |||
|
43 | 'seconds': obj.total_seconds() | |||
|
44 | } | |||
|
45 | else: | |||
|
46 | return json.JSONEncoder.default(self, obj) | |||
|
47 | ||||
|
48 | ||||
|
49 | def date_decoder(dct): | |||
|
50 | if '__type__' in dct: | |||
|
51 | if dct['__type__'] == '__datetime__': | |||
|
52 | return datetime.strptime(dct['iso'], DATE_FORMAT) | |||
|
53 | elif dct['__type__'] == '__date__': | |||
|
54 | return datetime.strptime(dct['iso'], DATE_FORMAT).date() | |||
|
55 | elif dct['__type__'] == '__timedelta__': | |||
|
56 | return timedelta(seconds=dct['seconds']) | |||
|
57 | return dct | |||
|
58 | ||||
|
59 | ||||
|
60 | def json_dumps(obj): | |||
|
61 | return json.dumps(obj, cls=DateEncoder) | |||
|
62 | ||||
|
63 | ||||
|
64 | def json_loads(obj): | |||
|
65 | return json.loads(obj.decode('utf8'), object_hook=date_decoder) |
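The `date_json` serializer defined above tags date-like objects with a `__type__` marker on encode, and the decoder's `object_hook` rebuilds them. A round-trip sketch, trimmed to the `datetime` and `timedelta` branches:

```python
# Round-trip sketch for the date_json encoder/decoder above
# (date branch omitted for brevity).
import json
from datetime import datetime, timedelta

DATE_FORMAT = '%Y-%m-%dT%H:%M:%S.%f'

class DateEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, datetime):
            return {'__type__': '__datetime__',
                    'iso': obj.strftime(DATE_FORMAT)}
        if isinstance(obj, timedelta):
            return {'__type__': '__timedelta__',
                    'seconds': obj.total_seconds()}
        return json.JSONEncoder.default(self, obj)

def date_decoder(dct):
    if dct.get('__type__') == '__datetime__':
        return datetime.strptime(dct['iso'], DATE_FORMAT)
    if dct.get('__type__') == '__timedelta__':
        return timedelta(seconds=dct['seconds'])
    return dct

payload = {'when': datetime(2016, 1, 2, 3, 4, 5, 6),
           'ttl': timedelta(minutes=5)}
raw = json.dumps(payload, cls=DateEncoder)
restored = json.loads(raw, object_hook=date_decoder)
assert restored == payload
```

This is what lets Celery task arguments carry `datetime` values through the broker, which plain JSON serialization would reject.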
@@ -0,0 +1,610 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import bisect | |||
|
23 | import collections | |||
|
24 | import math | |||
|
25 | from datetime import datetime, timedelta | |||
|
26 | ||||
|
27 | import sqlalchemy as sa | |||
|
28 | import pyelasticsearch | |||
|
29 | ||||
|
30 | from celery.utils.log import get_task_logger | |||
|
31 | from zope.sqlalchemy import mark_changed | |||
|
32 | from pyramid.threadlocal import get_current_request, get_current_registry | |||
|
33 | from appenlight.celery import celery | |||
|
34 | from appenlight.models.report_group import ReportGroup | |||
|
35 | from appenlight.models import DBSession, Datastores | |||
|
36 | from appenlight.models.report import Report | |||
|
37 | from appenlight.models.log import Log | |||
|
38 | from appenlight.models.request_metric import Metric | |||
|
39 | from appenlight.models.event import Event | |||
|
40 | ||||
|
41 | from appenlight.models.services.application import ApplicationService | |||
|
42 | from appenlight.models.services.event import EventService | |||
|
43 | from appenlight.models.services.log import LogService | |||
|
44 | from appenlight.models.services.report import ReportService | |||
|
45 | from appenlight.models.services.report_group import ReportGroupService | |||
|
46 | from appenlight.models.services.user import UserService | |||
|
47 | from appenlight.models.tag import Tag | |||
|
48 | from appenlight.lib import print_traceback | |||
|
49 | from appenlight.lib.utils import parse_proto, in_batches | |||
|
50 | from appenlight.lib.ext_json import json | |||
|
51 | from appenlight.lib.redis_keys import REDIS_KEYS | |||
|
52 | from appenlight.lib.enums import ReportType | |||
|
53 | ||||
|
54 | log = get_task_logger(__name__) | |||
|
55 | ||||
|
56 | sample_boundries = list(range(100, 10000, 100)) | |||
|
57 | ||||
|
58 | ||||
|
59 | def pick_sample(total_occurences, report_type=1): | |||
|
60 | every = 1.0 | |||
|
61 | position = bisect.bisect_left(sample_boundries, total_occurences) | |||
|
62 | if position > 0: | |||
|
63 | # 404 | |||
|
64 | if report_type == 2: | |||
|
65 | divide = 10.0 | |||
|
66 | else: | |||
|
67 | divide = 100.0 | |||
|
68 | every = sample_boundries[position - 1] / divide | |||
|
69 | return total_occurences % every == 0 | |||
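`pick_sample()` above thins report storage as a group grows: below the first boundary every report is kept, and past each 100-occurrence boundary only every Nth body is stored (every 10th for `report_type == 2`, the 404 case). A self-contained sketch of that behavior:

```python
# Behavior sketch of pick_sample(): sampling rate drops as the
# occurrence count crosses each 100-occurrence boundary.
import bisect

sample_boundries = list(range(100, 10000, 100))

def pick_sample(total_occurences, report_type=1):
    every = 1.0
    position = bisect.bisect_left(sample_boundries, total_occurences)
    if position > 0:
        divide = 10.0 if report_type == 2 else 100.0
        every = sample_boundries[position - 1] / divide
    return total_occurences % every == 0

assert pick_sample(50) is True        # below 100: keep everything
assert pick_sample(201) is False      # past 200: keep every 2nd only
assert pick_sample(202) is True
assert pick_sample(1001) is False     # past 1000: keep every 10th
assert pick_sample(1010) is True
```

Unsampled reports still bump the group's counters in `add_reports()`; only the per-report body and its ES document are skipped.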
|
70 | ||||
|
71 | ||||
|
72 | @celery.task(queue="default", default_retry_delay=1, max_retries=2) | |||
|
73 | def test_exception_task(): | |||
|
74 | log.error('test celery log', extra={'location': 'celery'}) | |||
|
75 | log.warning('test celery log', extra={'location': 'celery'}) | |||
|
76 | raise Exception('Celery exception test') | |||
|
77 | ||||
|
78 | ||||
|
79 | @celery.task(queue="default", default_retry_delay=1, max_retries=2) | |||
|
80 | def test_retry_exception_task(): | |||
|
81 | try: | |||
|
82 | import time | |||
|
83 | ||||
|
84 | time.sleep(1.3) | |||
|
85 | log.error('test retry celery log', extra={'location': 'celery'}) | |||
|
86 | log.warning('test retry celery log', extra={'location': 'celery'}) | |||
|
87 | raise Exception('Celery exception test') | |||
|
88 | except Exception as exc: | |||
|
89 | test_retry_exception_task.retry(exc=exc) | |||
|
90 | ||||
|
91 | ||||
|
92 | @celery.task(queue="reports", default_retry_delay=600, max_retries=999) | |||
|
93 | def add_reports(resource_id, params, dataset, environ=None, **kwargs): | |||
|
94 | proto_version = parse_proto(params.get('protocol_version', '')) | |||
|
95 | current_time = datetime.utcnow().replace(second=0, microsecond=0) | |||
|
96 | try: | |||
|
97 | # we will store ES docs here for a single bulk insert | |||
|
98 | es_report_docs = {} | |||
|
99 | es_report_group_docs = {} | |||
|
100 | resource = ApplicationService.by_id(resource_id) | |||
|
101 | reports = [] | |||
|
102 | ||||
|
103 | if proto_version.major < 1 and proto_version.minor < 5: | |||
|
104 | for report_data in dataset: | |||
|
105 | report_details = report_data.get('report_details', []) | |||
|
106 | for i, detail_data in enumerate(report_details): | |||
|
107 | report_data.update(detail_data) | |||
|
108 | report_data.pop('report_details') | |||
|
109 | traceback = report_data.get('traceback') | |||
|
110 | if traceback is None: | |||
|
111 | report_data['traceback'] = report_data.get('frameinfo') | |||
|
112 | # for 0.3 api | |||
|
113 | error = report_data.pop('error_type', '') | |||
|
114 | if error: | |||
|
115 | report_data['error'] = error | |||
|
116 | if proto_version.minor < 4: | |||
|
117 | # convert "Unknown" slow reports to | |||
|
118 | # '' (from older clients) | |||
|
119 | if (report_data['error'] and | |||
|
120 | report_data['http_status'] < 500): | |||
|
121 | report_data['error'] = '' | |||
|
122 | message = report_data.get('message') | |||
|
123 | if 'extra' not in report_data: | |||
|
124 | report_data['extra'] = [] | |||
|
125 | if message: | |||
|
126 | report_data['extra'] = [('message', message), ] | |||
|
127 | reports.append(report_data) | |||
|
128 | else: | |||
|
129 | reports = dataset | |||
|
130 | ||||
|
131 | tags = [] | |||
|
132 | es_slow_calls_docs = {} | |||
|
133 | es_reports_stats_rows = {} | |||
|
134 | for report_data in reports: | |||
|
135 | # build report details for later | |||
|
136 | added_details = 0 | |||
|
137 | report = Report() | |||
|
138 | report.set_data(report_data, resource, proto_version) | |||
|
139 | report._skip_ft_index = True | |||
|
140 | ||||
|
141 | report_group = ReportGroupService.by_hash_and_resource( | |||
|
142 | report.resource_id, | |||
|
143 | report.grouping_hash | |||
|
144 | ) | |||
|
145 | occurences = report_data.get('occurences', 1) | |||
|
146 | if not report_group: | |||
|
147 | # total reports will be +1 moment later | |||
|
148 | report_group = ReportGroup(grouping_hash=report.grouping_hash, | |||
|
149 | occurences=0, total_reports=0, | |||
|
150 | last_report=0, | |||
|
151 | priority=report.priority, | |||
|
152 | error=report.error, | |||
|
153 | first_timestamp=report.start_time) | |||
|
154 | report_group._skip_ft_index = True | |||
|
155 | report_group.report_type = report.report_type | |||
|
156 | report.report_group_time = report_group.first_timestamp | |||
|
157 | add_sample = pick_sample(report_group.occurences, | |||
|
158 | report_type=report_group.report_type) | |||
|
159 | if add_sample: | |||
|
160 | resource.report_groups.append(report_group) | |||
|
161 | report_group.reports.append(report) | |||
|
162 | added_details += 1 | |||
|
163 | DBSession.flush() | |||
|
164 | if report.partition_id not in es_report_docs: | |||
|
165 | es_report_docs[report.partition_id] = [] | |||
|
166 | es_report_docs[report.partition_id].append(report.es_doc()) | |||
|
167 | tags.extend(list(report.tags.items())) | |||
|
168 | slow_calls = report.add_slow_calls(report_data, report_group) | |||
|
169 | DBSession.flush() | |||
|
170 | for s_call in slow_calls: | |||
|
171 | if s_call.partition_id not in es_slow_calls_docs: | |||
|
172 | es_slow_calls_docs[s_call.partition_id] = [] | |||
|
173 | es_slow_calls_docs[s_call.partition_id].append( | |||
|
174 | s_call.es_doc()) | |||
|
175 | # try generating new stat rows if needed | |||
|
176 | else: | |||
|
177 | # required for postprocessing to not fail later | |||
|
178 | report.report_group = report_group | |||
|
179 | ||||
|
180 | stat_row = ReportService.generate_stat_rows( | |||
|
181 | report, resource, report_group) | |||
|
182 | if stat_row.partition_id not in es_reports_stats_rows: | |||
|
183 | es_reports_stats_rows[stat_row.partition_id] = [] | |||
|
184 | es_reports_stats_rows[stat_row.partition_id].append( | |||
|
185 | stat_row.es_doc()) | |||
|
186 | ||||
|
187 | # see if we should mark the 10th occurrence of a report | |||
|
188 | last_occurences_10 = int(math.floor(report_group.occurences / 10)) | |||
|
189 | curr_occurences_10 = int(math.floor( | |||
|
190 | (report_group.occurences + report.occurences) / 10)) | |||
|
191 | last_occurences_100 = int( | |||
|
192 | math.floor(report_group.occurences / 100)) | |||
|
193 | curr_occurences_100 = int(math.floor( | |||
|
194 | (report_group.occurences + report.occurences) / 100)) | |||
|
195 | notify_occurences_10 = last_occurences_10 != curr_occurences_10 | |||
|
196 | notify_occurences_100 = last_occurences_100 != curr_occurences_100 | |||
|
197 | report_group.occurences = ReportGroup.occurences + occurences | |||
|
198 | report_group.last_timestamp = report.start_time | |||
|
199 | report_group.summed_duration = ReportGroup.summed_duration + report.duration | |||
|
200 | summed_duration = ReportGroup.summed_duration + report.duration | |||
|
201 | summed_occurences = ReportGroup.occurences + occurences | |||
|
202 | report_group.average_duration = summed_duration / summed_occurences | |||
|
203 | report_group.run_postprocessing(report) | |||
|
204 | if added_details: | |||
|
205 | report_group.total_reports = ReportGroup.total_reports + 1 | |||
|
206 | report_group.last_report = report.id | |||
|
207 | report_group.set_notification_info(notify_10=notify_occurences_10, | |||
|
208 | notify_100=notify_occurences_100) | |||
|
209 | DBSession.flush() | |||
|
210 | report_group.get_report().notify_channel(report_group) | |||
|
211 | if report_group.partition_id not in es_report_group_docs: | |||
|
212 | es_report_group_docs[report_group.partition_id] = [] | |||
|
213 | es_report_group_docs[report_group.partition_id].append( | |||
|
214 | report_group.es_doc()) | |||
|
215 | ||||
|
216 | action = 'REPORT' | |||
|
217 | log_msg = '%s: %s %s, client: %s, proto: %s' % ( | |||
|
218 | action, | |||
|
219 | report_data.get('http_status', 'unknown'), | |||
|
220 | str(resource), | |||
|
221 | report_data.get('client'), | |||
|
222 | proto_version) | |||
|
223 | log.info(log_msg) | |||
|
224 | total_reports = len(dataset) | |||
|
225 | key = REDIS_KEYS['counters']['reports_per_minute'].format(current_time) | |||
|
226 | Datastores.redis.incr(key, total_reports) | |||
|
227 | Datastores.redis.expire(key, 3600 * 24) | |||
|
228 | key = REDIS_KEYS['counters']['reports_per_minute_per_app'].format( | |||
|
229 | resource_id, current_time) | |||
|
230 | Datastores.redis.incr(key, total_reports) | |||
|
231 | Datastores.redis.expire(key, 3600 * 24) | |||
|
232 | ||||
|
233 | add_reports_es(es_report_group_docs, es_report_docs) | |||
|
234 | add_reports_slow_calls_es(es_slow_calls_docs) | |||
|
235 | add_reports_stats_rows_es(es_reports_stats_rows) | |||
|
236 | return True | |||
|
237 | except Exception as exc: | |||
|
238 | print_traceback(log) | |||
|
239 | add_reports.retry(exc=exc) | |||
|
240 | ||||
|
241 | ||||
|
242 | @celery.task(queue="es", default_retry_delay=600, max_retries=999) | |||
|
243 | def add_reports_es(report_group_docs, report_docs): | |||
|
244 | for k, v in report_group_docs.items(): | |||
|
245 | Datastores.es.bulk_index(k, 'report_group', v, id_field="_id") | |||
|
246 | for k, v in report_docs.items(): | |||
|
247 | Datastores.es.bulk_index(k, 'report', v, id_field="_id", | |||
|
248 | parent_field='_parent') | |||
|
249 | ||||
|
250 | ||||
|
251 | @celery.task(queue="es", default_retry_delay=600, max_retries=999) | |||
|
252 | def add_reports_slow_calls_es(es_docs): | |||
|
253 | for k, v in es_docs.items(): | |||
|
254 | Datastores.es.bulk_index(k, 'log', v) | |||
|
255 | ||||
|
256 | ||||
|
257 | @celery.task(queue="es", default_retry_delay=600, max_retries=999) | |||
|
258 | def add_reports_stats_rows_es(es_docs): | |||
|
259 | for k, v in es_docs.items(): | |||
|
260 | Datastores.es.bulk_index(k, 'log', v) | |||
|
261 | ||||
|
262 | ||||
|
263 | @celery.task(queue="logs", default_retry_delay=600, max_retries=999) | |||
|
264 | def add_logs(resource_id, request, dataset, environ=None, **kwargs): | |||
|
265 | proto_version = request.get('protocol_version') | |||
|
266 | current_time = datetime.utcnow().replace(second=0, microsecond=0) | |||
|
267 | ||||
|
268 | try: | |||
|
269 | es_docs = collections.defaultdict(list) | |||
|
270 | application = ApplicationService.by_id(resource_id) | |||
|
271 | ns_pairs = [] | |||
|
272 | for entry in dataset: | |||
|
273 | # gather pk and ns so we can remove older versions of row later | |||
|
274 | if entry['primary_key'] is not None: | |||
|
275 | ns_pairs.append({"pk": entry['primary_key'], | |||
|
276 | "ns": entry['namespace']}) | |||
|
277 | log_entry = Log() | |||
|
278 | log_entry.set_data(entry, resource=application) | |||
|
279 | log_entry._skip_ft_index = True | |||
|
280 | application.logs.append(log_entry) | |||
|
281 | DBSession.flush() | |||
|
282 | # insert non pk rows first | |||
|
283 | if entry['primary_key'] is None: | |||
|
284 | es_docs[log_entry.partition_id].append(log_entry.es_doc()) | |||
|
285 | ||||
|
286 | # 2nd pass to delete all log entries from db for the same pk/ns pair | |||
|
287 | if ns_pairs: | |||
|
288 | ids_to_delete = [] | |||
|
289 | es_docs = collections.defaultdict(list) | |||
|
290 | es_docs_to_delete = collections.defaultdict(list) | |||
|
291 | found_pkey_logs = LogService.query_by_primary_key_and_namespace( | |||
|
292 | list_of_pairs=ns_pairs) | |||
|
293 | log_dict = {} | |||
|
294 | for log_entry in found_pkey_logs: | |||
|
295 | log_key = (log_entry.primary_key, log_entry.namespace) | |||
|
296 | if log_key not in log_dict: | |||
|
297 | log_dict[log_key] = [] | |||
|
298 | log_dict[log_key].append(log_entry) | |||
|
299 | ||||
|
300 | for ns, entry_list in log_dict.items(): | |||
|
301 | entry_list = sorted(entry_list, key=lambda x: x.timestamp) | |||
|
302 | # newest row needs to be indexed in es | |||
|
303 | log_entry = entry_list[-1] | |||
|
304 | # delete everything from pg and ES, leave the last row in pg | |||
|
305 | for e in entry_list[:-1]: | |||
|
306 | ids_to_delete.append(e.log_id) | |||
|
307 | es_docs_to_delete[e.partition_id].append(e.delete_hash) | |||
|
308 | ||||
|
309 | es_docs_to_delete[log_entry.partition_id].append( | |||
|
310 | log_entry.delete_hash) | |||
|
311 | ||||
|
312 | es_docs[log_entry.partition_id].append(log_entry.es_doc()) | |||
|
313 | ||||
|
314 | if ids_to_delete: | |||
|
315 | query = DBSession.query(Log).filter( | |||
|
316 | Log.log_id.in_(ids_to_delete)) | |||
|
317 | query.delete(synchronize_session=False) | |||
|
318 | if es_docs_to_delete: | |||
|
319 | # batch this to avoid problems with default ES bulk limits | |||
|
320 | for es_index in es_docs_to_delete.keys(): | |||
|
321 | for batch in in_batches(es_docs_to_delete[es_index], 20): | |||
|
322 | query = {'terms': {'delete_hash': batch}} | |||
|
323 | ||||
|
324 | try: | |||
|
325 | Datastores.es.delete_by_query( | |||
|
326 | es_index, 'log', query) | |||
|
327 | except pyelasticsearch.ElasticHttpNotFoundError as exc: | |||
|
328 | log.error(exc) | |||
|
329 | ||||
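The second pass above keeps only the newest row per (primary_key, namespace) pair and schedules everything older for deletion. The grouping-and-sorting core of that pass can be isolated as a plain function; rows are represented here as dicts purely for illustration, while the real code works on `Log` ORM objects:

```python
import collections

def newest_per_key(entries):
    """Split log rows into the newest row per (primary_key, namespace)
    pair plus the ids of all older rows, mirroring the dedup pass above."""
    grouped = collections.defaultdict(list)
    for e in entries:
        grouped[(e['primary_key'], e['namespace'])].append(e)
    keep, delete_ids = [], []
    for pair, rows in grouped.items():
        # newest row needs to be indexed; everything older gets deleted
        rows.sort(key=lambda r: r['timestamp'])
        keep.append(rows[-1])
        delete_ids.extend(r['log_id'] for r in rows[:-1])
    return keep, delete_ids
```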
|
330 | total_logs = len(dataset) | |||
|
331 | ||||
|
332 | log_msg = 'LOG_NEW: %s, entries: %s, proto:%s' % ( | |||
|
333 | str(application), | |||
|
334 | total_logs, | |||
|
335 | proto_version) | |||
|
336 | log.info(log_msg) | |||
|
337 | # mark_changed(session) | |||
|
338 | key = REDIS_KEYS['counters']['logs_per_minute'].format(current_time) | |||
|
339 | Datastores.redis.incr(key, total_logs) | |||
|
340 | Datastores.redis.expire(key, 3600 * 24) | |||
|
341 | key = REDIS_KEYS['counters']['logs_per_minute_per_app'].format( | |||
|
342 | resource_id, current_time) | |||
|
343 | Datastores.redis.incr(key, total_logs) | |||
|
344 | Datastores.redis.expire(key, 3600 * 24) | |||
|
345 | add_logs_es(es_docs) | |||
|
346 | return True | |||
|
347 | except Exception as exc: | |||
|
348 | print_traceback(log) | |||
|
349 | add_logs.retry(exc=exc) | |||
|
350 | ||||
|
351 | ||||
|
352 | @celery.task(queue="es", default_retry_delay=600, max_retries=999) | |||
|
353 | def add_logs_es(es_docs): | |||
|
354 | for k, v in es_docs.items(): | |||
|
355 | Datastores.es.bulk_index(k, 'log', v) | |||
|
356 | ||||
|
357 | ||||
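Both ingest tasks above bump per-minute Redis counters with `incr` followed by a 24-hour `expire`. The key layout can be sketched without a live Redis; the key templates below are assumptions modeled on the `REDIS_KEYS['counters']` entries used here, not the project's actual templates:

```python
from datetime import datetime

# hypothetical key templates modeled on REDIS_KEYS['counters']
LOGS_PER_MINUTE = 'counters:logs_per_minute:{}'
LOGS_PER_MINUTE_PER_APP = 'counters:logs_per_minute_per_app:{}:{}'

def counter_keys(resource_id, now):
    """Return the global and per-app counter keys one ingest batch touches.

    The timestamp is truncated to the minute, exactly as add_logs does
    with replace(second=0, microsecond=0).
    """
    current_time = now.replace(second=0, microsecond=0)
    return (LOGS_PER_MINUTE.format(current_time),
            LOGS_PER_MINUTE_PER_APP.format(resource_id, current_time))

keys = counter_keys(42, datetime(2016, 5, 1, 12, 30, 45))
```

A real worker would follow each `incr(key, total_logs)` with `expire(key, 3600 * 24)`, as the tasks above do, so stale minute buckets age out on their own.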
|
358 | @celery.task(queue="metrics", default_retry_delay=600, max_retries=999) | |||
|
359 | def add_metrics(resource_id, request, dataset, proto_version): | |||
|
360 | current_time = datetime.utcnow().replace(second=0, microsecond=0) | |||
|
361 | try: | |||
|
362 | application = ApplicationService.by_id_cached()(resource_id) | |||
|
363 | application = DBSession.merge(application, load=False) | |||
|
364 | es_docs = [] | |||
|
365 | rows = [] | |||
|
366 | for metric in dataset: | |||
|
367 | tags = dict(metric['tags']) | |||
|
368 | server_n = tags.get('server_name', metric['server_name']).lower() | |||
|
369 | tags['server_name'] = server_n or 'unknown' | |||
|
370 | new_metric = Metric( | |||
|
371 | timestamp=metric['timestamp'], | |||
|
372 | resource_id=application.resource_id, | |||
|
373 | namespace=metric['namespace'], | |||
|
374 | tags=tags) | |||
|
375 | rows.append(new_metric) | |||
|
376 | es_docs.append(new_metric.es_doc()) | |||
|
377 | session = DBSession() | |||
|
378 | session.bulk_save_objects(rows) | |||
|
379 | session.flush() | |||
|
380 | ||||
|
381 | action = 'METRICS' | |||
|
382 | metrics_msg = '%s: %s, metrics: %s, proto:%s' % ( | |||
|
383 | action, | |||
|
384 | str(application), | |||
|
385 | len(dataset), | |||
|
386 | proto_version | |||
|
387 | ) | |||
|
388 | log.info(metrics_msg) | |||
|
389 | ||||
|
390 | mark_changed(session) | |||
|
391 | key = REDIS_KEYS['counters']['metrics_per_minute'].format(current_time) | |||
|
392 | Datastores.redis.incr(key, len(rows)) | |||
|
393 | Datastores.redis.expire(key, 3600 * 24) | |||
|
394 | key = REDIS_KEYS['counters']['metrics_per_minute_per_app'].format( | |||
|
395 | resource_id, current_time) | |||
|
396 | Datastores.redis.incr(key, len(rows)) | |||
|
397 | Datastores.redis.expire(key, 3600 * 24) | |||
|
398 | add_metrics_es(es_docs) | |||
|
399 | return True | |||
|
400 | except Exception as exc: | |||
|
401 | print_traceback(log) | |||
|
402 | add_metrics.retry(exc=exc) | |||
|
403 | ||||
|
404 | ||||
|
405 | @celery.task(queue="es", default_retry_delay=600, max_retries=999) | |||
|
406 | def add_metrics_es(es_docs): | |||
|
407 | for doc in es_docs: | |||
|
408 | partition = 'rcae_m_%s' % doc['timestamp'].strftime('%Y_%m_%d') | |||
|
409 | Datastores.es.index(partition, 'log', doc) | |||
|
410 | ||||
|
411 | ||||
|
412 | @celery.task(queue="default", default_retry_delay=5, max_retries=2) | |||
|
413 | def check_user_report_notifications(resource_id, since_when=None): | |||
|
414 | try: | |||
|
415 | request = get_current_request() | |||
|
416 | application = ApplicationService.by_id(resource_id) | |||
|
417 | if not application: | |||
|
418 | return | |||
|
419 | error_key = REDIS_KEYS['reports_to_notify_per_type_per_app'].format( | |||
|
420 | ReportType.error, resource_id) | |||
|
421 | slow_key = REDIS_KEYS['reports_to_notify_per_type_per_app'].format( | |||
|
422 | ReportType.slow, resource_id) | |||
|
423 | error_group_ids = Datastores.redis.smembers(error_key) | |||
|
424 | slow_group_ids = Datastores.redis.smembers(slow_key) | |||
|
425 | Datastores.redis.delete(error_key) | |||
|
426 | Datastores.redis.delete(slow_key) | |||
|
427 | err_gids = [int(g_id) for g_id in error_group_ids] | |||
|
428 | slow_gids = [int(g_id) for g_id in list(slow_group_ids)] | |||
|
429 | group_ids = err_gids + slow_gids | |||
|
430 | occurence_dict = {} | |||
|
431 | for g_id in group_ids: | |||
|
432 | key = REDIS_KEYS['counters']['report_group_occurences'].format( | |||
|
433 | g_id) | |||
|
434 | val = Datastores.redis.get(key) | |||
|
435 | Datastores.redis.delete(key) | |||
|
436 | if val: | |||
|
437 | occurence_dict[g_id] = int(val) | |||
|
438 | else: | |||
|
439 | occurence_dict[g_id] = 1 | |||
|
440 | report_groups = ReportGroupService.by_ids(group_ids) | |||
|
441 | report_groups = report_groups.options(sa.orm.joinedload(ReportGroup.last_report_ref)) | |||
|
442 | ||||
|
443 | ApplicationService.check_for_groups_alert( | |||
|
444 | application, 'alert', report_groups=report_groups, | |||
|
445 | occurence_dict=occurence_dict, since_when=since_when) | |||
|
446 | users = set([p.user for p in application.users_for_perm('view')]) | |||
|
447 | report_groups = report_groups.all() | |||
|
448 | for user in users: | |||
|
449 | UserService.report_notify(user, request, application, | |||
|
450 | report_groups=report_groups, | |||
|
451 | occurence_dict=occurence_dict, | |||
|
452 | since_when=since_when) | |||
|
453 | for group in report_groups: | |||
|
454 | # marks report_groups as notified | |||
|
455 | if not group.notified: | |||
|
456 | group.notified = True | |||
|
457 | except Exception as exc: | |||
|
458 | print_traceback(log) | |||
|
459 | raise | |||
|
460 | ||||
|
461 | ||||
|
462 | @celery.task(queue="default", default_retry_delay=1, max_retries=2) | |||
|
463 | def close_alerts(since_when=None): | |||
|
464 | log.warning('Checking alerts') | |||
|
465 | try: | |||
|
466 | event_types = [Event.types['error_report_alert'], | |||
|
467 | Event.types['slow_report_alert'], ] | |||
|
468 | statuses = [Event.statuses['active']] | |||
|
469 | # get events older than 5 min | |||
|
470 | events = EventService.by_type_and_status( | |||
|
471 | event_types, | |||
|
472 | statuses, | |||
|
473 | older_than=(since_when - timedelta(minutes=5))) | |||
|
474 | for event in events: | |||
|
475 | # see if we can close them | |||
|
476 | event.validate_or_close( | |||
|
477 | since_when=(since_when - timedelta(minutes=1))) | |||
|
478 | except Exception as exc: | |||
|
479 | print_traceback(log) | |||
|
480 | raise | |||
|
481 | ||||
|
482 | ||||
|
483 | @celery.task(queue="default", default_retry_delay=600, max_retries=999) | |||
|
484 | def update_tag_counter(tag_name, tag_value, count): | |||
|
485 | try: | |||
|
486 | query = DBSession.query(Tag).filter(Tag.name == tag_name).filter( | |||
|
487 | sa.cast(Tag.value, sa.types.TEXT) == sa.cast(json.dumps(tag_value), | |||
|
488 | sa.types.TEXT)) | |||
|
489 | query.update({'times_seen': Tag.times_seen + count, | |||
|
490 | 'last_timestamp': datetime.utcnow()}, | |||
|
491 | synchronize_session=False) | |||
|
492 | session = DBSession() | |||
|
493 | mark_changed(session) | |||
|
494 | return True | |||
|
495 | except Exception as exc: | |||
|
496 | print_traceback(log) | |||
|
497 | update_tag_counter.retry(exc=exc) | |||
|
498 | ||||
|
499 | ||||
|
500 | @celery.task(queue="default") | |||
|
501 | def update_tag_counters(): | |||
|
502 | """ | |||
|
503 | Sets task to update counters for application tags | |||
|
504 | """ | |||
|
505 | tags = Datastores.redis.lrange(REDIS_KEYS['seen_tag_list'], 0, -1) | |||
|
506 | Datastores.redis.delete(REDIS_KEYS['seen_tag_list']) | |||
|
507 | c = collections.Counter(tags) | |||
|
508 | for t_json, count in c.items(): | |||
|
509 | tag_info = json.loads(t_json) | |||
|
510 | update_tag_counter.delay(tag_info[0], tag_info[1], count) | |||
|
511 | ||||
|
512 | ||||
|
513 | @celery.task(queue="default") | |||
|
514 | def daily_digest(): | |||
|
515 | """ | |||
|
516 | Sends daily digest with top 50 error reports | |||
|
517 | """ | |||
|
518 | request = get_current_request() | |||
|
519 | apps = Datastores.redis.smembers(REDIS_KEYS['apps_that_had_reports']) | |||
|
520 | Datastores.redis.delete(REDIS_KEYS['apps_that_had_reports']) | |||
|
521 | since_when = datetime.utcnow() - timedelta(hours=8) | |||
|
522 | log.warning('Generating daily digests') | |||
|
523 | for resource_id in apps: | |||
|
524 | resource_id = resource_id.decode('utf8') | |||
|
525 | end_date = datetime.utcnow().replace(microsecond=0, second=0) | |||
|
526 | filter_settings = {'resource': [resource_id], | |||
|
527 | 'tags': [{'name': 'type', | |||
|
528 | 'value': ['error'], 'op': None}], | |||
|
529 | 'type': 'error', 'start_date': since_when, | |||
|
530 | 'end_date': end_date} | |||
|
531 | ||||
|
532 | reports = ReportGroupService.get_trending( | |||
|
533 | request, filter_settings=filter_settings, limit=50) | |||
|
534 | ||||
|
535 | application = ApplicationService.by_id(resource_id) | |||
|
536 | if application: | |||
|
537 | users = set([p.user for p in application.users_for_perm('view')]) | |||
|
538 | for user in users: | |||
|
539 | user.send_digest(request, application, reports=reports, | |||
|
540 | since_when=since_when) | |||
|
541 | ||||
|
542 | ||||
|
543 | @celery.task(queue="default") | |||
|
544 | def alerting(): | |||
|
545 | """ | |||
|
546 | Loop that checks redis for info and then issues new tasks to celery to | |||
|
547 | perform the following: | |||
|
548 | - which applications should have new alerts opened | |||
|
549 | - which currently opened alerts should be closed | |||
|
550 | """ | |||
|
551 | start_time = datetime.utcnow() | |||
|
552 | # transactions are needed for mailer | |||
|
553 | apps = Datastores.redis.smembers(REDIS_KEYS['apps_that_had_reports']) | |||
|
554 | Datastores.redis.delete(REDIS_KEYS['apps_that_had_reports']) | |||
|
555 | for app in apps: | |||
|
556 | log.warning('Notify for app: %s' % app) | |||
|
557 | check_user_report_notifications.delay(app.decode('utf8')) | |||
|
558 | # clear app ids from set | |||
|
559 | close_alerts.delay(since_when=start_time) | |||
|
560 | ||||
|
561 | ||||
|
562 | @celery.task(queue="default", soft_time_limit=3600 * 4, hard_time_limit=3600 * 4, | |||
|
563 | max_retries=999) | |||
|
564 | def logs_cleanup(resource_id, filter_settings): | |||
|
565 | request = get_current_request() | |||
|
566 | request.tm.begin() | |||
|
567 | es_query = { | |||
|
568 | "_source": False, | |||
|
569 | "size": 5000, | |||
|
570 | "query": { | |||
|
571 | "filtered": { | |||
|
572 | "filter": { | |||
|
573 | "and": [{"term": {"resource_id": resource_id}}] | |||
|
574 | } | |||
|
575 | } | |||
|
576 | } | |||
|
577 | } | |||
|
578 | ||||
|
579 | query = DBSession.query(Log).filter(Log.resource_id == resource_id) | |||
|
580 | if filter_settings['namespace']: | |||
|
581 | query = query.filter(Log.namespace == filter_settings['namespace'][0]) | |||
|
582 | es_query['query']['filtered']['filter']['and'].append( | |||
|
583 | {"term": {"namespace": filter_settings['namespace'][0]}} | |||
|
584 | ) | |||
|
585 | query.delete(synchronize_session=False) | |||
|
586 | request.tm.commit() | |||
|
587 | result = request.es_conn.search(es_query, index='rcae_l_*', | |||
|
588 | doc_type='log', es_scroll='1m', | |||
|
589 | es_search_type='scan') | |||
|
590 | scroll_id = result['_scroll_id'] | |||
|
591 | while True: | |||
|
592 | log.warning('log_cleanup, app:{} ns:{} batch'.format( | |||
|
593 | resource_id, | |||
|
594 | filter_settings['namespace'] | |||
|
595 | )) | |||
|
596 | es_docs_to_delete = [] | |||
|
597 | result = request.es_conn.send_request( | |||
|
598 | 'POST', ['_search', 'scroll'], | |||
|
599 | body=scroll_id, query_params={"scroll": '1m'}) | |||
|
600 | scroll_id = result['_scroll_id'] | |||
|
601 | if not result['hits']['hits']: | |||
|
602 | break | |||
|
603 | for doc in result['hits']['hits']: | |||
|
604 | es_docs_to_delete.append({"id": doc['_id'], | |||
|
605 | "index": doc['_index']}) | |||
|
606 | ||||
|
607 | for batch in in_batches(es_docs_to_delete, 10): | |||
|
608 | Datastores.es.bulk([Datastores.es.delete_op(doc_type='log', | |||
|
609 | **to_del) | |||
|
610 | for to_del in batch]) |
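The cleanup loops above chunk document ids with an `in_batches` helper before issuing bulk ES operations, keeping each request under default Elasticsearch bulk limits. The helper's definition lies outside this hunk, so the following is only a plausible sketch of such a chunker:

```python
import itertools

def in_batches(seq, size):
    """Yield successive lists of at most `size` items from `seq`.

    Hypothetical stand-in for the project's own helper; shown only to
    illustrate how delete/bulk calls are kept small.
    """
    it = iter(seq)
    while True:
        batch = list(itertools.islice(it, size))
        if not batch:
            return
        yield batch

batches = list(in_batches(range(7), 3))  # [[0, 1, 2], [3, 4, 5], [6]]
```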
@@ -0,0 +1,24 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | def filter_callable(structure, section=None): | |||
|
23 | structure['SOMEVAL'] = '***REMOVED***' | |||
|
24 | return structure |
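The `filter_callable` hook above masks one hard-coded key. The same hook shape generalizes to a recursive scrubber; the key list and nested-structure support below are illustrative assumptions, not part of the shipped hook:

```python
# assumed list of sensitive key names, lower-cased for comparison
SENSITIVE_KEYS = {'someval', 'password', 'secret', 'token'}

def scrub(structure, section=None):
    """Recursively replace values stored under sensitive-looking keys.

    Keeps the (structure, section) signature of filter_callable so it
    could slot in wherever such hooks are accepted.
    """
    if isinstance(structure, dict):
        return {k: '***REMOVED***' if k.lower() in SENSITIVE_KEYS
                else scrub(v) for k, v in structure.items()}
    if isinstance(structure, list):
        return [scrub(v) for v in structure]
    return structure

scrub({'user': 'bob', 'password': 'x', 'meta': {'token': 'y'}})
```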
@@ -0,0 +1,903 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import wtforms | |||
|
23 | import formencode | |||
|
24 | import re | |||
|
25 | import pyramid.threadlocal | |||
|
26 | import datetime | |||
|
27 | import appenlight.lib.helpers as h | |||
|
28 | ||||
|
29 | from appenlight.models.user import User | |||
|
30 | from appenlight.models.group import Group | |||
|
31 | from appenlight.models import DBSession | |||
|
32 | from appenlight.models.alert_channel import AlertChannel | |||
|
33 | from appenlight.models.integrations import IntegrationException | |||
|
34 | from appenlight.models.integrations.campfire import CampfireIntegration | |||
|
35 | from appenlight.models.integrations.bitbucket import BitbucketIntegration | |||
|
36 | from appenlight.models.integrations.github import GithubIntegration | |||
|
37 | from appenlight.models.integrations.flowdock import FlowdockIntegration | |||
|
38 | from appenlight.models.integrations.hipchat import HipchatIntegration | |||
|
39 | from appenlight.models.integrations.jira import JiraClient | |||
|
40 | from appenlight.models.integrations.slack import SlackIntegration | |||
|
41 | from appenlight.lib.ext_json import json | |||
|
42 | from wtforms.ext.csrf.form import SecureForm | |||
|
43 | from wtforms.compat import iteritems | |||
|
44 | from collections import defaultdict | |||
|
45 | ||||
|
46 | _ = str | |||
|
47 | ||||
|
48 | strip_filter = lambda x: x.strip() if x else None | |||
|
49 | uppercase_filter = lambda x: x.upper() if x else None | |||
|
50 | ||||
|
51 | FALSE_VALUES = ('false', '', False, None) | |||
|
52 | ||||
|
53 | ||||
|
54 | class CSRFException(Exception): | |||
|
55 | pass | |||
|
56 | ||||
|
57 | ||||
|
58 | class ReactorForm(SecureForm): | |||
|
59 | def __init__(self, formdata=None, obj=None, prefix='', csrf_context=None, | |||
|
60 | **kwargs): | |||
|
61 | super(ReactorForm, self).__init__(formdata=formdata, obj=obj, | |||
|
62 | prefix=prefix, | |||
|
63 | csrf_context=csrf_context, **kwargs) | |||
|
64 | self._csrf_context = csrf_context | |||
|
65 | ||||
|
66 | def generate_csrf_token(self, csrf_context): | |||
|
67 | return csrf_context.session.get_csrf_token() | |||
|
68 | ||||
|
69 | def validate_csrf_token(self, field): | |||
|
70 | request = self._csrf_context or pyramid.threadlocal.get_current_request() | |||
|
71 | is_from_auth_token = 'auth:auth_token' in request.effective_principals | |||
|
72 | if is_from_auth_token: | |||
|
73 | return True | |||
|
74 | ||||
|
75 | if field.data != field.current_token: | |||
|
76 | # try to save the day by using token from angular | |||
|
77 | if request.headers.get('X-XSRF-TOKEN') != field.current_token: | |||
|
78 | raise CSRFException('Invalid CSRF token') | |||
|
79 | ||||
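`validate_csrf_token` above accepts a request in three cases: it is authenticated by an auth token principal, the form field matches the session token, or Angular's `X-XSRF-TOKEN` header matches. That decision tree restated as a standalone function (the names here are illustrative, not the form's API):

```python
class CSRFError(Exception):
    pass

def check_csrf(form_token, session_token, headers, from_auth_token=False):
    """Return True when the request should pass the CSRF check,
    mirroring the order of checks in validate_csrf_token."""
    if from_auth_token:
        # token-authenticated API calls skip the CSRF check entirely
        return True
    if form_token == session_token:
        return True
    # fall back to the header Angular sends with XHR requests
    if headers.get('X-XSRF-TOKEN') == session_token:
        return True
    raise CSRFError('Invalid CSRF token')
```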
|
80 | @property | |||
|
81 | def errors_dict(self): | |||
|
82 | r_dict = defaultdict(list) | |||
|
83 | for k, errors in self.errors.items(): | |||
|
84 | r_dict[k].extend([str(e) for e in errors]) | |||
|
85 | return r_dict | |||
|
86 | ||||
|
87 | @property | |||
|
88 | def errors_json(self): | |||
|
89 | return json.dumps(self.errors_dict) | |||
|
90 | ||||
|
91 | def populate_obj(self, obj, ignore_none=False): | |||
|
92 | """ | |||
|
93 | Populates the attributes of the passed `obj` with data from the form's | |||
|
94 | fields. | |||
|
95 | ||||
|
96 | :note: This is a destructive operation; Any attribute with the same name | |||
|
97 | as a field will be overridden. Use with caution. | |||
|
98 | """ | |||
|
99 | if ignore_none: | |||
|
100 | for name, field in iteritems(self._fields): | |||
|
101 | if field.data is not None: | |||
|
102 | field.populate_obj(obj, name) | |||
|
103 | else: | |||
|
104 | for name, field in iteritems(self._fields): | |||
|
105 | field.populate_obj(obj, name) | |||
|
106 | ||||
|
107 | css_classes = {} | |||
|
108 | ignore_labels = {} | |||
|
109 | ||||
|
110 | ||||
|
111 | class SignInForm(ReactorForm): | |||
|
112 | came_from = wtforms.HiddenField() | |||
|
113 | sign_in_user_name = wtforms.StringField(_('User Name')) | |||
|
114 | sign_in_user_password = wtforms.PasswordField(_('Password')) | |||
|
115 | ||||
|
116 | ignore_labels = ['submit'] | |||
|
117 | css_classes = {'submit': 'btn btn-primary'} | |||
|
118 | ||||
|
119 | html_attrs = {'sign_in_user_name': {'placeholder': 'Your login'}, | |||
|
120 | 'sign_in_user_password': { | |||
|
121 | 'placeholder': 'Your password'}} | |||
|
122 | ||||
|
123 | ||||
|
124 | from wtforms.widgets import html_params, HTMLString | |||
|
125 | ||||
|
126 | ||||
|
127 | def select_multi_checkbox(field, ul_class='set', **kwargs): | |||
|
128 | """Render a multi-checkbox widget""" | |||
|
129 | kwargs.setdefault('type', 'checkbox') | |||
|
130 | field_id = kwargs.pop('id', field.id) | |||
|
131 | html = ['<ul %s>' % html_params(id=field_id, class_=ul_class)] | |||
|
132 | for value, label, checked in field.iter_choices(): | |||
|
133 | choice_id = '%s-%s' % (field_id, value) | |||
|
134 | options = dict(kwargs, name=field.name, value=value, id=choice_id) | |||
|
135 | if checked: | |||
|
136 | options['checked'] = 'checked' | |||
|
137 | html.append('<li><input %s /> ' % html_params(**options)) | |||
|
138 | html.append('<label for="%s">%s</label></li>' % (choice_id, label)) | |||
|
139 | html.append('</ul>') | |||
|
140 | return HTMLString(''.join(html)) | |||
|
141 | ||||
|
142 | ||||
|
143 | def button_widget(field, button_cls='ButtonField btn btn-default', **kwargs): | |||
|
144 | """Render a button widget""" | |||
|
145 | kwargs.setdefault('type', 'button') | |||
|
146 | field_id = kwargs.pop('id', field.id) | |||
|
147 | kwargs.setdefault('value', field.label.text) | |||
|
148 | html = ['<button %s>%s</button>' % (html_params(id=field_id, | |||
|
149 | class_=button_cls), | |||
|
150 | kwargs['value'],)] | |||
|
151 | return HTMLString(''.join(html)) | |||
|
152 | ||||
|
153 | ||||
|
154 | def clean_whitespace(value): | |||
|
155 | if value: | |||
|
156 | return value.strip() | |||
|
157 | return value | |||
|
158 | ||||
|
159 | ||||
|
160 | def found_username_validator(form, field): | |||
|
161 | user = User.by_user_name(field.data) | |||
|
162 | # sets user to recover in email validator | |||
|
163 | form.field_user = user | |||
|
164 | if not user: | |||
|
165 | raise wtforms.ValidationError('This username does not exist') | |||
|
166 | ||||
|
167 | ||||
|
168 | def found_username_email_validator(form, field): | |||
|
169 | user = User.by_email(field.data) | |||
|
170 | if not user: | |||
|
171 | raise wtforms.ValidationError('Email is incorrect') | |||
|
172 | ||||
|
173 | ||||
|
174 | def unique_username_validator(form, field): | |||
|
175 | user = User.by_user_name(field.data) | |||
|
176 | if user: | |||
|
177 | raise wtforms.ValidationError('This username already exists in system') | |||
|
178 | ||||
|
179 | ||||
|
180 | def unique_groupname_validator(form, field): | |||
|
181 | group = Group.by_group_name(field.data) | |||
|
182 | mod_group = getattr(form, '_modified_group', None) | |||
|
183 | if group and (not mod_group or mod_group.id != group.id): | |||
|
184 | raise wtforms.ValidationError( | |||
|
185 | 'This group name already exists in system') | |||
|
186 | ||||
|
187 | ||||
|
188 | def unique_email_validator(form, field): | |||
|
189 | user = User.by_email(field.data) | |||
|
190 | if user: | |||
|
191 | raise wtforms.ValidationError('This email already exists in system') | |||
|
192 | ||||
|
193 | ||||
|
194 | def email_validator(form, field): | |||
|
195 | validator = formencode.validators.Email() | |||
|
196 | try: | |||
|
197 | validator.to_python(field.data) | |||
|
198 | except formencode.Invalid as e: | |||
|
199 | raise wtforms.ValidationError(e) | |||
|
200 | ||||
|
201 | ||||
|
202 | def unique_alert_email_validator(form, field): | |||
|
203 | q = DBSession.query(AlertChannel) | |||
|
204 | q = q.filter(AlertChannel.channel_name == 'email') | |||
|
205 | q = q.filter(AlertChannel.channel_value == field.data) | |||
|
206 | email = q.first() | |||
|
207 | if email: | |||
|
208 | raise wtforms.ValidationError( | |||
|
209 | 'This email already exists in alert system') | |||
|
210 | ||||
|
211 | ||||
|
212 | def blocked_email_validator(form, field): | |||
|
213 | blocked_emails = [ | |||
|
214 | 'goood-mail.org', | |||
|
215 | 'shoeonlineblog.com', | |||
|
216 | 'louboutinemart.com', | |||
|
217 | 'guccibagshere.com', | |||
|
218 | 'nikeshoesoutletforsale.com' | |||
|
219 | ] | |||
|
220 | data = field.data or '' | |||
|
221 | domain = data.split('@')[-1] | |||
|
222 | if domain in blocked_emails: | |||
|
223 | raise wtforms.ValidationError('Don\'t spam') | |||
|
224 | ||||
|
225 | ||||
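`blocked_email_validator` keys off everything after the last `@` in the address. That extraction is worth isolating, since a value with no `@` at all is compared against the block list as a whole string; the domain list below is abbreviated from the one above:

```python
BLOCKED_DOMAINS = {'goood-mail.org', 'shoeonlineblog.com'}

def is_blocked(email):
    """True when the address's domain is block-listed.

    Uses the same split('@')[-1] rule as blocked_email_validator.
    """
    domain = (email or '').split('@')[-1]
    return domain in BLOCKED_DOMAINS

is_blocked('spam@goood-mail.org')  # True
is_blocked('user@example.com')     # False
```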
|
226 | def old_password_validator(form, field): | |||
|
227 | if not field.user.check_password(field.data or ''): | |||
|
228 | raise wtforms.ValidationError('You need to enter correct password') | |||
|
229 | ||||
|
230 | ||||
|
231 | class UserRegisterForm(ReactorForm): | |||
|
232 | user_name = wtforms.StringField( | |||
|
233 | _('User Name'), | |||
|
234 | filters=[strip_filter], | |||
|
235 | validators=[ | |||
|
236 | wtforms.validators.Length(min=2, max=30), | |||
|
237 | wtforms.validators.Regexp( | |||
|
238 | re.compile(r'^[\.\w-]+$', re.UNICODE), | |||
|
239 | message="Invalid characters used"), | |||
|
240 | unique_username_validator, | |||
|
241 | wtforms.validators.DataRequired() | |||
|
242 | ]) | |||
|
243 | ||||
|
244 | user_password = wtforms.PasswordField(_('User Password'), | |||
|
245 | filters=[strip_filter], | |||
|
246 | validators=[ | |||
|
247 | wtforms.validators.Length(min=4), | |||
|
248 | wtforms.validators.DataRequired() | |||
|
249 | ]) | |||
|
250 | ||||
|
251 | email = wtforms.StringField(_('Email Address'), | |||
|
252 | filters=[strip_filter], | |||
|
253 | validators=[email_validator, | |||
|
254 | unique_email_validator, | |||
|
255 | blocked_email_validator, | |||
|
256 | wtforms.validators.DataRequired()], | |||
|
257 | description=_("We promise we will not share " | |||
|
258 | "your email with anyone")) | |||
|
259 | first_name = wtforms.HiddenField(_('First Name')) | |||
|
260 | last_name = wtforms.HiddenField(_('Last Name')) | |||
|
261 | ||||
|
262 | ignore_labels = ['submit'] | |||
|
263 | css_classes = {'submit': 'btn btn-primary'} | |||
|
264 | ||||
|
265 | html_attrs = {'user_name': {'placeholder': 'Your login'}, | |||
|
266 | 'user_password': {'placeholder': 'Your password'}, | |||
|
267 | 'email': {'placeholder': 'Your email'}} | |||
|
268 | ||||
|
269 | ||||
|
270 | class UserCreateForm(UserRegisterForm): | |||
|
271 | status = wtforms.BooleanField('User status', | |||
|
272 | false_values=FALSE_VALUES) | |||
|
273 | ||||
|
274 | ||||
|
275 | class UserUpdateForm(UserCreateForm): | |||
|
276 | user_name = None | |||
|
277 | user_password = wtforms.PasswordField(_('User Password'), | |||
|
278 | filters=[strip_filter], | |||
|
279 | validators=[ | |||
|
280 | wtforms.validators.Length(min=4), | |||
|
281 | wtforms.validators.Optional() | |||
|
282 | ]) | |||
|
283 | email = wtforms.StringField(_('Email Address'), | |||
|
284 | filters=[strip_filter], | |||
|
285 | validators=[email_validator, | |||
|
286 | wtforms.validators.DataRequired()]) | |||
|
287 | ||||
|
288 | ||||
|
289 | class LostPasswordForm(ReactorForm): | |||
|
290 | email = wtforms.StringField(_('Email Address'), | |||
|
291 | filters=[strip_filter], | |||
|
292 | validators=[email_validator, | |||
|
293 | found_username_email_validator, | |||
|
294 | wtforms.validators.DataRequired()]) | |||
|
295 | ||||
|
296 | submit = wtforms.SubmitField(_('Reset password')) | |||
|
297 | ignore_labels = ['submit'] | |||
|
298 | css_classes = {'submit': 'btn btn-primary'} | |||
|
299 | ||||
|
300 | ||||
|
301 | class ChangePasswordForm(ReactorForm): | |||
|
302 | old_password = wtforms.PasswordField( | |||
|
303 | 'Old Password', | |||
|
304 | filters=[strip_filter], | |||
|
305 | validators=[old_password_validator, | |||
|
306 | wtforms.validators.DataRequired()]) | |||
|
307 | ||||
|
308 | new_password = wtforms.PasswordField( | |||
|
309 | 'New Password', | |||
|
310 | filters=[strip_filter], | |||
|
311 | validators=[wtforms.validators.Length(min=4), | |||
|
312 | wtforms.validators.DataRequired()]) | |||
|
313 | new_password_confirm = wtforms.PasswordField( | |||
|
314 | 'Confirm Password', | |||
|
315 | filters=[strip_filter], | |||
|
316 | validators=[wtforms.validators.EqualTo('new_password'), | |||
|
317 | wtforms.validators.DataRequired()]) | |||
|
318 | submit = wtforms.SubmitField('Change Password') | |||
|
319 | ignore_labels = ['submit'] | |||
|
320 | css_classes = {'submit': 'btn btn-primary'} | |||
|
321 | ||||
|
322 | ||||
|
323 | class CheckPasswordForm(ReactorForm): | |||
|
324 | password = wtforms.PasswordField( | |||
|
325 | 'Password', | |||
|
326 | filters=[strip_filter], | |||
|
327 | validators=[old_password_validator, | |||
|
328 | wtforms.validators.DataRequired()]) | |||
|
329 | ||||
|
330 | ||||
|
331 | class NewPasswordForm(ReactorForm): | |||
|
332 | new_password = wtforms.PasswordField( | |||
|
333 | 'New Password', | |||
|
334 | filters=[strip_filter], | |||
|
335 | validators=[wtforms.validators.Length(min=4), | |||
|
336 | wtforms.validators.DataRequired()]) | |||
|
337 | new_password_confirm = wtforms.PasswordField( | |||
|
338 | 'Confirm Password', | |||
|
339 | filters=[strip_filter], | |||
|
340 | validators=[wtforms.validators.EqualTo('new_password'), | |||
|
341 | wtforms.validators.DataRequired()]) | |||
|
342 | submit = wtforms.SubmitField('Set Password') | |||
|
343 | ignore_labels = ['submit'] | |||
|
344 | css_classes = {'submit': 'btn btn-primary'} | |||
|
345 | ||||
|
346 | ||||
|
347 | class CORSTextAreaField(wtforms.StringField): | |||
|
348 | """ | |||
|
349 | This field represents an HTML ``<textarea>`` and can be used to take | |||
|
350 | multi-line input. | |||
|
351 | """ | |||
|
352 | widget = wtforms.widgets.TextArea() | |||
|
353 | ||||
|
354 | def process_formdata(self, valuelist): | |||
|
355 | self.data = [] | |||
|
356 | if valuelist: | |||
|
357 | data = [x.strip() for x in valuelist[0].split('\n')] | |||
|
358 | for d in data: | |||
|
359 | if not d: | |||
|
360 | continue | |||
|
361 | if d.startswith('www.'): | |||
|
362 | d = d[4:] | |||
|
363 | if data: | |||
|
364 | self.data.append(d) | |||
|
365 | else: | |||
|
366 | self.data = [] | |||
|
367 | self.data = '\n'.join(self.data) | |||
|
368 | ||||
|
369 | ||||
|
370 | class ApplicationCreateForm(ReactorForm): | |||
|
371 | resource_name = wtforms.StringField( | |||
|
372 | _('Application name'), | |||
|
373 | filters=[strip_filter], | |||
|
374 | validators=[wtforms.validators.Length(min=1), | |||
|
375 | wtforms.validators.DataRequired()]) | |||
|
376 | ||||
|
377 | domains = CORSTextAreaField( | |||
|
378 | _('Domain names for CORS headers '), | |||
|
379 | validators=[wtforms.validators.Length(min=1), | |||
|
380 | wtforms.validators.Optional()], | |||
|
381 | description='Required for Javascript error ' | |||
|
382 | 'tracking (one domain per line, skip the http:// part)') | |||
|
383 | ||||
|
384 | submit = wtforms.SubmitField(_('Create Application')) | |||
|
385 | ||||
|
386 | ignore_labels = ['submit'] | |||
|
387 | css_classes = {'submit': 'btn btn-primary'} | |||
|
388 | html_attrs = {'resource_name': {'placeholder': 'Application Name'}, | |||
|
389 | 'uptime_url': {'placeholder': 'http://somedomain.com'}} | |||
|
390 | ||||
|
391 | ||||
|
392 | class ApplicationUpdateForm(ApplicationCreateForm): | |||
|
393 | default_grouping = wtforms.SelectField( | |||
|
394 | _('Default grouping for errors'), | |||
|
395 | choices=[('url_type', 'Error Type + location',), | |||
|
396 | ('url_traceback', 'Traceback + location',), | |||
|
397 | ('traceback_server', 'Traceback + Server',)], | |||
|
398 | default='url_traceback') | |||
|
399 | ||||
|
400 | error_report_threshold = wtforms.IntegerField( | |||
|
401 | _('Alert on error reports'), | |||
|
402 | validators=[ | |||
|
403 | wtforms.validators.NumberRange(min=1), | |||
|
404 | wtforms.validators.DataRequired() | |||
|
405 | ], | |||
|
406 | description='Application needs to send at least this many ' | |||
|
407 | 'error reports per minute to open an alert' | |||
|
408 | ) | |||
|
409 | ||||
|
410 | slow_report_threshold = wtforms.IntegerField( | |||
|
411 | _('Alert on slow reports'), | |||
|
412 | validators=[wtforms.validators.NumberRange(min=1), | |||
|
413 | wtforms.validators.DataRequired()], | |||
|
414 | description='Application needs to send at least this many ' | |||
|
415 | 'slow reports per minute to open an alert' | |||
|
416 | ||||
|
417 | allow_permanent_storage = wtforms.BooleanField( | |||
|
418 | _('Permanent logs'), | |||
|
419 | false_values=FALSE_VALUES, | |||
|
420 | description=_( | |||
|
421 | 'Allow permanent storage of logs in separate DB partitions')) | |||
|
422 | ||||
|
423 | submit = wtforms.SubmitField(_('Update Application')) | |||
|
424 | ||||
|
425 | ||||
|
426 | class UserSearchSchemaForm(ReactorForm): | |||
|
427 | user_name = wtforms.StringField('User Name', | |||
|
428 | filters=[strip_filter], ) | |||
|
429 | ||||
|
430 | submit = wtforms.SubmitField(_('Search User')) | |||
|
431 | ignore_labels = ['submit'] | |||
|
432 | css_classes = {'submit': 'btn btn-primary'} | |||
|
433 | ||||
|
434 | '<li class="user_exists"><span></span></li>' | |||
|
435 | ||||
|
436 | ||||
|
437 | class YesNoForm(ReactorForm): | |||
|
438 | no = wtforms.SubmitField('No', default='') | |||
|
439 | yes = wtforms.SubmitField('Yes', default='') | |||
|
440 | ignore_labels = ['submit'] | |||
|
441 | css_classes = {'submit': 'btn btn-primary'} | |||
|
442 | ||||
|
443 | ||||
|
444 | status_codes = [('', 'All',), ('500', '500',), ('404', '404',)] | |||
|
445 | ||||
|
446 | priorities = [('', 'All',)] | |||
|
447 | for i in range(1, 11): | |||
|
448 | priorities.append((str(i), str(i),)) | |||
|
449 | ||||
|
450 | report_status_choices = [('', 'All',), | |||
|
451 | ('never_reviewed', 'Never reviewed',), | |||

452 | ('reviewed', 'Reviewed',), | |||
|
453 | ('public', 'Public',), | |||
|
454 | ('fixed', 'Fixed',), ] | |||
|
455 | ||||
|
456 | ||||
|
457 | class ReportBrowserForm(ReactorForm): | |||
|
458 | applications = wtforms.SelectMultipleField('Applications', | |||
|
459 | widget=select_multi_checkbox) | |||
|
460 | http_status = wtforms.SelectField('HTTP Status', choices=status_codes) | |||
|
461 | priority = wtforms.SelectField('Priority', choices=priorities, default='') | |||
|
462 | start_date = wtforms.DateField('Start Date') | |||
|
463 | end_date = wtforms.DateField('End Date') | |||
|
464 | error = wtforms.StringField('Error') | |||
|
465 | url_path = wtforms.StringField('URL Path') | |||
|
466 | url_domain = wtforms.StringField('URL Domain') | |||
|
467 | report_status = wtforms.SelectField('Report status', | |||
|
468 | choices=report_status_choices, | |||
|
469 | default='') | |||
|
470 | submit = wtforms.SubmitField('<span class="glyphicon glyphicon-search">' | |||
|
471 | '</span> Filter results', | |||
|
472 | widget=button_widget) | |||
|
473 | ||||
|
474 | ignore_labels = ['submit'] | |||
|
475 | css_classes = {'submit': 'btn btn-primary'} | |||
|
476 | ||||
|
477 | ||||
|
478 | slow_report_status_choices = [('', 'All',), | |||
|
479 | ('never_reviewed', 'Never reviewed',), | |||

480 | ('reviewed', 'Reviewed',), | |||
|
481 | ('public', 'Public',), ] | |||
|
482 | ||||
|
483 | ||||
|
484 | class BulkOperationForm(ReactorForm): | |||
|
485 | applications = wtforms.SelectField('Applications') | |||
|
486 | start_date = wtforms.DateField( | |||
|
487 | 'Start Date', | |||
|
488 | default=lambda: datetime.datetime.utcnow() - datetime.timedelta( | |||
|
489 | days=90)) | |||
|
490 | end_date = wtforms.DateField('End Date') | |||
|
491 | confirm = wtforms.BooleanField( | |||
|
492 | 'Confirm operation', | |||
|
493 | validators=[wtforms.validators.DataRequired()]) | |||
|
494 | ||||
|
495 | ||||
|
496 | class LogBrowserForm(ReactorForm): | |||
|
497 | applications = wtforms.SelectMultipleField('Applications', | |||
|
498 | widget=select_multi_checkbox) | |||
|
499 | start_date = wtforms.DateField('Start Date') | |||
|
500 | log_level = wtforms.StringField('Log level') | |||
|
501 | message = wtforms.StringField('Message') | |||
|
502 | namespace = wtforms.StringField('Namespace') | |||
|
503 | submit = wtforms.SubmitField( | |||
|
504 | '<span class="glyphicon glyphicon-search"></span> Filter results', | |||
|
505 | widget=button_widget) | |||
|
506 | ignore_labels = ['submit'] | |||
|
507 | css_classes = {'submit': 'btn btn-primary'} | |||
|
508 | ||||
|
509 | ||||
|
510 | class CommentForm(ReactorForm): | |||
|
511 | body = wtforms.TextAreaField('Comment', validators=[ | |||
|
512 | wtforms.validators.Length(min=1), | |||
|
513 | wtforms.validators.DataRequired() | |||
|
514 | ]) | |||
|
515 | submit = wtforms.SubmitField('Comment', ) | |||
|
516 | ignore_labels = ['submit'] | |||
|
517 | css_classes = {'submit': 'btn btn-primary'} | |||
|
518 | ||||
|
519 | ||||
|
520 | class EmailChannelCreateForm(ReactorForm): | |||
|
521 | email = wtforms.StringField(_('Email Address'), | |||
|
522 | filters=[strip_filter], | |||
|
523 | validators=[email_validator, | |||
|
524 | unique_alert_email_validator, | |||
|
525 | wtforms.validators.DataRequired()]) | |||
|
526 | submit = wtforms.SubmitField('Add email channel', ) | |||
|
527 | ignore_labels = ['submit'] | |||
|
528 | css_classes = {'submit': 'btn btn-primary'} | |||
|
529 | ||||
|
530 | ||||
|
531 | def gen_user_profile_form(): | |||
|
532 | class UserProfileForm(ReactorForm): | |||
|
533 | email = wtforms.StringField( | |||
|
534 | _('Email Address'), | |||
|
535 | validators=[email_validator, wtforms.validators.DataRequired()]) | |||
|
536 | first_name = wtforms.StringField(_('First Name')) | |||
|
537 | last_name = wtforms.StringField(_('Last Name')) | |||
|
538 | company_name = wtforms.StringField(_('Company Name')) | |||
|
539 | company_address = wtforms.TextAreaField(_('Company Address')) | |||
|
540 | zip_code = wtforms.StringField(_('ZIP code')) | |||
|
541 | city = wtforms.StringField(_('City')) | |||
|
542 | notifications = wtforms.BooleanField('Account notifications', | |||
|
543 | false_values=FALSE_VALUES) | |||
|
544 | submit = wtforms.SubmitField(_('Update Account')) | |||
|
545 | ignore_labels = ['submit'] | |||
|
546 | css_classes = {'submit': 'btn btn-primary'} | |||
|
547 | ||||
|
548 | return UserProfileForm | |||
|
549 | ||||
|
550 | ||||
|
551 | class PurgeAppForm(ReactorForm): | |||
|
552 | resource_id = wtforms.HiddenField( | |||
|
553 | 'App Id', | |||
|
554 | validators=[wtforms.validators.DataRequired()]) | |||
|
555 | days = wtforms.IntegerField( | |||
|
556 | 'Days', | |||
|
557 | validators=[wtforms.validators.DataRequired()]) | |||
|
558 | password = wtforms.PasswordField( | |||
|
559 | 'Admin Password', | |||
|
560 | validators=[old_password_validator, wtforms.validators.DataRequired()]) | |||
|
561 | submit = wtforms.SubmitField(_('Purge Data')) | |||
|
562 | ignore_labels = ['submit'] | |||
|
563 | css_classes = {'submit': 'btn btn-primary'} | |||
|
564 | ||||
|
565 | ||||
|
566 | class IntegrationRepoForm(ReactorForm): | |||
|
567 | host_name = wtforms.StringField("Service Host", default='') | |||
|
568 | user_name = wtforms.StringField( | |||
|
569 | "User Name", | |||
|
570 | filters=[strip_filter], | |||
|
571 | validators=[wtforms.validators.DataRequired(), | |||
|
572 | wtforms.validators.Length(min=1)]) | |||
|
573 | repo_name = wtforms.StringField( | |||
|
574 | "Repo Name", | |||
|
575 | filters=[strip_filter], | |||
|
576 | validators=[wtforms.validators.DataRequired(), | |||
|
577 | wtforms.validators.Length(min=1)]) | |||
|
578 | ||||
|
579 | ||||
|
580 | class IntegrationBitbucketForm(IntegrationRepoForm): | |||
|
581 | host_name = wtforms.StringField("Service Host", | |||
|
582 | default='https://bitbucket.org') | |||
|
583 | ||||
|
584 | def validate_user_name(self, field): | |||
|
585 | try: | |||
|
586 | request = pyramid.threadlocal.get_current_request() | |||
|
587 | client = BitbucketIntegration.create_client( | |||
|
588 | request, | |||
|
589 | self.user_name.data, | |||
|
590 | self.repo_name.data) | |||
|
591 | client.get_assignees() | |||
|
592 | except IntegrationException as e: | |||
|
593 | raise wtforms.validators.ValidationError(str(e)) | |||
|
594 | ||||
|
595 | ||||
|
596 | class IntegrationGithubForm(IntegrationRepoForm): | |||
|
597 | host_name = wtforms.StringField("Service Host", | |||
|
598 | default='https://github.com') | |||
|
599 | ||||
|
600 | def validate_user_name(self, field): | |||
|
601 | try: | |||
|
602 | request = pyramid.threadlocal.get_current_request() | |||
|
603 | client = GithubIntegration.create_client( | |||
|
604 | request, | |||
|
605 | self.user_name.data, | |||
|
606 | self.repo_name.data) | |||
|
607 | client.get_assignees() | |||
|
608 | except IntegrationException as e: | |||
|
609 | raise wtforms.validators.ValidationError(str(e)) | |||
|
611 | ||||
|
612 | ||||
|
613 | def filter_rooms(data): | |||
|
614 | if data is not None: | |||
|
615 | rooms = data.split(',') | |||
|
616 | return ','.join([r.strip() for r in rooms]) | |||
|
617 | ||||
|
618 | ||||
|
619 | class IntegrationCampfireForm(ReactorForm): | |||
|
620 | account = wtforms.StringField( | |||
|
621 | 'Account', | |||
|
622 | filters=[strip_filter], | |||
|
623 | validators=[wtforms.validators.DataRequired()]) | |||
|
624 | api_token = wtforms.StringField( | |||
|
625 | 'Api Token', | |||
|
626 | filters=[strip_filter], | |||
|
627 | validators=[wtforms.validators.DataRequired()]) | |||
|
628 | rooms = wtforms.StringField('Room ID list', filters=[filter_rooms]) | |||
|
629 | ||||
|
630 | def validate_api_token(self, field): | |||
|
631 | try: | |||
|
632 | client = CampfireIntegration.create_client(self.api_token.data, | |||
|
633 | self.account.data) | |||
|
634 | client.get_account() | |||
|
635 | except IntegrationException as e: | |||
|
636 | raise wtforms.validators.ValidationError(str(e)) | |||
|
637 | ||||
|
638 | def validate_rooms(self, field): | |||
|
639 | if not field.data: | |||
|
640 | return | |||
|
641 | client = CampfireIntegration.create_client(self.api_token.data, | |||
|
642 | self.account.data) | |||
|
643 | ||||
|
644 | try: | |||
|
645 | room_list = [r['id'] for r in client.get_rooms()] | |||
|
646 | except IntegrationException as e: | |||
|
647 | raise wtforms.validators.ValidationError(str(e)) | |||
|
648 | ||||
|
649 | rooms = field.data.split(',') | |||
|
650 | if len(rooms) > 3: | |||
|
651 | msg = 'You can use up to 3 room ids' | |||
|
652 | raise wtforms.validators.ValidationError(msg) | |||
|
653 | if rooms: | |||
|
654 | for room_id in rooms: | |||

655 | if not room_id.strip().isdigit(): | |||

656 | msg = 'You must use only integers for room ids' | |||

657 | raise wtforms.validators.ValidationError(msg) | |||

658 | if int(room_id) not in room_list: | |||

659 | msg = "Room %s doesn't exist" | |||

660 | raise wtforms.validators.ValidationError(msg % room_id) | |||
|
661 | ||||
|
662 | submit = wtforms.SubmitField(_('Connect to Campfire')) | |||
|
663 | ignore_labels = ['submit'] | |||
|
664 | css_classes = {'submit': 'btn btn-primary'} | |||
|
665 | ||||
|
666 | ||||
|
671 | ||||
|
672 | ||||
|
673 | class IntegrationHipchatForm(ReactorForm): | |||
|
674 | api_token = wtforms.StringField( | |||
|
675 | 'Api Token', | |||
|
676 | filters=[strip_filter], | |||
|
677 | validators=[wtforms.validators.DataRequired()]) | |||
|
678 | rooms = wtforms.StringField( | |||
|
679 | 'Room ID list', | |||
|
680 | filters=[filter_rooms], | |||
|
681 | validators=[wtforms.validators.DataRequired()]) | |||
|
682 | ||||
|
683 | def validate_rooms(self, field): | |||
|
684 | if not field.data: | |||
|
685 | return | |||
|
686 | client = HipchatIntegration.create_client(self.api_token.data) | |||
|
687 | rooms = field.data.split(',') | |||
|
688 | if len(rooms) > 3: | |||
|
689 | msg = 'You can use up to 3 room ids' | |||
|
690 | raise wtforms.validators.ValidationError(msg) | |||
|
691 | if rooms: | |||
|
692 | for room_id in rooms: | |||
|
693 | if not room_id.strip().isdigit(): | |||
|
694 | msg = 'You must use only integers for room ids' | |||
|
695 | raise wtforms.validators.ValidationError(msg) | |||
|
696 | try: | |||
|
697 | client.send({ | |||
|
698 | "message_format": 'text', | |||
|
699 | "message": "testing for room existence", | |||
|
700 | "from": "App Enlight", | |||
|
701 | "room_id": room_id, | |||
|
702 | "color": "green" | |||
|
703 | }) | |||
|
704 | except IntegrationException as exc: | |||
|
705 | msg = 'Room id: %s exception: %s' | |||
|
706 | raise wtforms.validators.ValidationError(msg % (room_id, | |||
|
707 | exc)) | |||
|
708 | ||||
|
709 | ||||
|
710 | class IntegrationFlowdockForm(ReactorForm): | |||
|
711 | api_token = wtforms.StringField('API Token', | |||
|
712 | filters=[strip_filter], | |||
|
713 | validators=[ | |||
|
714 | wtforms.validators.DataRequired() | |||
|
715 | ], ) | |||
|
716 | ||||
|
717 | def validate_api_token(self, field): | |||
|
718 | try: | |||
|
719 | client = FlowdockIntegration.create_client(self.api_token.data) | |||
|
720 | registry = pyramid.threadlocal.get_current_registry() | |||
|
721 | payload = { | |||
|
722 | "source": registry.settings['mailing.from_name'], | |||
|
723 | "from_address": registry.settings['mailing.from_email'], | |||
|
724 | "subject": "Integration test", | |||
|
725 | "content": "If you can see this it was successful", | |||
|
726 | "tags": ["appenlight"], | |||
|
727 | "link": registry.settings['mailing.app_url'] | |||
|
728 | } | |||
|
729 | client.send_to_inbox(payload) | |||
|
730 | except IntegrationException as e: | |||
|
731 | raise wtforms.validators.ValidationError(str(e)) | |||
|
732 | ||||
|
733 | ||||
|
734 | class IntegrationSlackForm(ReactorForm): | |||
|
735 | webhook_url = wtforms.StringField( | |||
|
736 | 'Reports webhook', | |||
|
737 | filters=[strip_filter], | |||
|
738 | validators=[wtforms.validators.DataRequired()]) | |||
|
739 | ||||
|
740 | def validate_webhook_url(self, field): | |||
|
741 | registry = pyramid.threadlocal.get_current_registry() | |||
|
742 | client = SlackIntegration.create_client(field.data) | |||
|
743 | link = "<%s|%s>" % (registry.settings['mailing.app_url'], | |||
|
744 | registry.settings['mailing.from_name']) | |||
|
745 | test_data = { | |||
|
746 | "username": "App Enlight", | |||
|
747 | "icon_emoji": ":fire:", | |||
|
748 | "attachments": [ | |||
|
749 | {"fallback": "Testing integration channel: %s" % link, | |||
|
750 | "pretext": "Testing integration channel: %s" % link, | |||
|
751 | "color": "good", | |||
|
752 | "fields": [ | |||
|
753 | { | |||
|
754 | "title": "Status", | |||
|
755 | "value": "Integration is working fine", | |||
|
756 | "short": False | |||
|
757 | } | |||
|
758 | ]} | |||
|
759 | ] | |||
|
760 | } | |||
|
761 | try: | |||
|
762 | client.make_request(data=test_data) | |||
|
763 | except IntegrationException as exc: | |||
|
764 | raise wtforms.validators.ValidationError(str(exc)) | |||
|
765 | ||||
|
766 | ||||
|
767 | class IntegrationWebhooksForm(ReactorForm): | |||
|
768 | reports_webhook = wtforms.StringField( | |||
|
769 | 'Reports webhook', | |||
|
770 | filters=[strip_filter], | |||
|
771 | validators=[wtforms.validators.DataRequired()]) | |||
|
772 | alerts_webhook = wtforms.StringField( | |||
|
773 | 'Alerts webhook', | |||
|
774 | filters=[strip_filter], | |||
|
775 | validators=[wtforms.validators.DataRequired()]) | |||
|
776 | submit = wtforms.SubmitField(_('Setup webhooks')) | |||
|
777 | ignore_labels = ['submit'] | |||
|
778 | css_classes = {'submit': 'btn btn-primary'} | |||
|
779 | ||||
|
780 | ||||
|
781 | class IntegrationJiraForm(ReactorForm): | |||
|
782 | host_name = wtforms.StringField( | |||
|
783 | 'Server URL', | |||
|
784 | filters=[strip_filter], | |||
|
785 | validators=[wtforms.validators.DataRequired()]) | |||
|
786 | user_name = wtforms.StringField( | |||
|
787 | 'Username', | |||
|
788 | filters=[strip_filter], | |||
|
789 | validators=[wtforms.validators.DataRequired()]) | |||
|
790 | password = wtforms.PasswordField( | |||
|
791 | 'Password', | |||
|
792 | filters=[strip_filter], | |||
|
793 | validators=[wtforms.validators.DataRequired()]) | |||
|
794 | project = wtforms.StringField( | |||
|
795 | 'Project key', | |||
|
796 | filters=[uppercase_filter, strip_filter], | |||
|
797 | validators=[wtforms.validators.DataRequired()]) | |||
|
798 | ||||
|
799 | def validate_project(self, field): | |||
|
800 | if not field.data: | |||
|
801 | return | |||
|
802 | try: | |||
|
803 | client = JiraClient(self.user_name.data, | |||
|
804 | self.password.data, | |||
|
805 | self.host_name.data, | |||
|
806 | self.project.data) | |||
|
807 | except Exception as exc: | |||
|
808 | raise wtforms.validators.ValidationError(str(exc)) | |||
|
809 | ||||
|
810 | room_list = [r.key.upper() for r in client.get_projects()] | |||
|
811 | if field.data.upper() not in room_list: | |||
|
812 | msg = "Project %s doesn't exist in your Jira instance" | |||
|
813 | raise wtforms.validators.ValidationError(msg % field.data) | |||
|
814 | ||||
|
815 | ||||
|
816 | def get_deletion_form(resource): | |||
|
817 | class F(ReactorForm): | |||
|
818 | application_name = wtforms.StringField( | |||
|
819 | 'Application Name', | |||
|
820 | filters=[strip_filter], | |||
|
821 | validators=[wtforms.validators.AnyOf([resource.resource_name])]) | |||
|
822 | resource_id = wtforms.HiddenField(default=resource.resource_id) | |||
|
823 | submit = wtforms.SubmitField(_('Delete my application')) | |||
|
824 | ignore_labels = ['submit'] | |||
|
825 | css_classes = {'submit': 'btn btn-danger'} | |||
|
826 | ||||
|
827 | return F | |||
|
828 | ||||
|
829 | ||||
|
830 | class ChangeApplicationOwnerForm(ReactorForm): | |||
|
831 | password = wtforms.PasswordField( | |||
|
832 | 'Password', | |||
|
833 | filters=[strip_filter], | |||
|
834 | validators=[old_password_validator, | |||
|
835 | wtforms.validators.DataRequired()]) | |||
|
836 | ||||
|
837 | user_name = wtforms.StringField( | |||
|
838 | "New owner's username", | |||
|
839 | filters=[strip_filter], | |||
|
840 | validators=[found_username_validator, | |||
|
841 | wtforms.validators.DataRequired()]) | |||
|
842 | submit = wtforms.SubmitField(_('Transfer ownership of application')) | |||
|
843 | ignore_labels = ['submit'] | |||
|
844 | css_classes = {'submit': 'btn btn-danger'} | |||
|
845 | ||||
|
846 | ||||
|
847 | def default_filename(): | |||
|
848 | return 'Invoice %s' % datetime.datetime.utcnow().strftime('%Y/%m') | |||
|
849 | ||||
|
850 | ||||
|
851 | class FileUploadForm(ReactorForm): | |||
|
852 | title = wtforms.StringField('File Title', | |||
|
853 | default=default_filename, | |||
|
854 | validators=[wtforms.validators.DataRequired()]) | |||
|
855 | file = wtforms.FileField('File') | |||
|
856 | ||||
|
857 | def validate_file(self, field): | |||
|
858 | if not hasattr(field.data, 'file'): | |||
|
859 | raise wtforms.ValidationError('File is missing') | |||
|
860 | ||||
|
861 | submit = wtforms.SubmitField(_('Upload')) | |||
|
862 | ||||
|
863 | ||||
|
864 | def get_partition_deletion_form(es_indices, pg_indices): | |||
|
865 | class F(ReactorForm): | |||
|
866 | es_index = wtforms.SelectMultipleField('Elasticsearch', | |||
|
867 | choices=[(ix, '') for ix in | |||
|
868 | es_indices]) | |||
|
869 | pg_index = wtforms.SelectMultipleField('pg', | |||
|
870 | choices=[(ix, '') for ix in | |||
|
871 | pg_indices]) | |||
|
872 | confirm = wtforms.StringField('Confirm', | |||
|
873 | filters=[uppercase_filter, strip_filter], | |||
|
874 | validators=[ | |||
|
875 | wtforms.validators.AnyOf(['CONFIRM']), | |||
|
876 | wtforms.validators.DataRequired()]) | |||
|
877 | ignore_labels = ['submit'] | |||
|
878 | css_classes = {'submit': 'btn btn-danger'} | |||
|
879 | ||||
|
880 | return F | |||
|
881 | ||||
|
882 | ||||
|
883 | class GroupCreateForm(ReactorForm): | |||
|
884 | group_name = wtforms.StringField( | |||
|
885 | _('Group Name'), | |||
|
886 | filters=[strip_filter], | |||
|
887 | validators=[ | |||
|
888 | wtforms.validators.Length(min=2, max=50), | |||
|
889 | unique_groupname_validator, | |||
|
890 | wtforms.validators.DataRequired() | |||
|
891 | ]) | |||
|
892 | description = wtforms.StringField(_('Group description')) | |||
|
893 | ||||
|
894 | ||||
|
895 | time_choices = [(k, v['label'],) for k, v in h.time_deltas.items()] | |||
|
896 | ||||
|
897 | ||||
|
898 | class AuthTokenCreateForm(ReactorForm): | |||
|
899 | description = wtforms.StringField(_('Token description')) | |||
|
900 | expires = wtforms.SelectField('Expires', | |||
|
901 | coerce=lambda x: x, | |||
|
902 | choices=time_choices, | |||
|
903 | validators=[wtforms.validators.Optional()]) |
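The `filter_rooms` helper in the forms file above normalizes a comma-separated room-id list before validation runs. A standalone sketch of that same logic (the function body is copied from the diff; the sample inputs are illustrative):

```python
def filter_rooms(data):
    """Normalize a comma-separated room-id list by stripping whitespace.

    Mirrors the wtforms field filter defined above: None passes through
    unchanged, since wtforms filters receive None for missing input.
    """
    if data is not None:
        rooms = data.split(',')
        return ','.join(r.strip() for r in rooms)

print(filter_rooms(' 12, 34 ,56'))  # 12,34,56
print(filter_rooms(None))           # None
```

Because the filter runs before `validate_rooms`, the validators can safely split on `,` without re-stripping each entry.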
@@ -0,0 +1,1 b'' | |||||
|
1 | Generic single-database configuration. No newline at end of file |
@@ -0,0 +1,22 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 |
@@ -0,0 +1,103 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from alembic import context | |||
|
23 | from sqlalchemy import engine_from_config, pool, MetaData | |||
|
24 | from logging.config import fileConfig | |||
|
25 | from appenlight.models import NAMING_CONVENTION | |||
|
26 | ||||
|
27 | # this is the Alembic Config object, which provides | |||
|
28 | # access to the values within the .ini file in use. | |||
|
29 | config = context.config | |||
|
30 | ||||
|
31 | # Interpret the config file for Python logging. | |||
|
32 | # This line sets up loggers basically. | |||
|
33 | if config.config_file_name: | |||
|
34 | fileConfig(config.config_file_name) | |||
|
35 | ||||
|
36 | # add your model's MetaData object here | |||
|
37 | # for 'autogenerate' support | |||
|
38 | # from myapp import mymodel | |||
|
39 | # target_metadata = mymodel.Base.metadata | |||
|
40 | ||||
|
41 | ||||
|
42 | target_metadata = MetaData(naming_convention=NAMING_CONVENTION) | |||
|
43 | ||||
|
44 | # other values from the config, defined by the needs of env.py, | |||
|
45 | # can be acquired: | |||
|
46 | # my_important_option = config.get_main_option("my_important_option") | |||
|
47 | # ... etc. | |||
|
48 | ||||
|
49 | VERSION_TABLE_NAME = 'alembic_appenlight_version' | |||
|
50 | ||||
|
51 | ||||
|
52 | def run_migrations_offline(): | |||
|
53 | """Run migrations in 'offline' mode. | |||
|
54 | ||||
|
55 | This configures the context with just a URL | |||
|
56 | and not an Engine, though an Engine is acceptable | |||
|
57 | here as well. By skipping the Engine creation | |||
|
58 | we don't even need a DBAPI to be available. | |||
|
59 | ||||
|
60 | Calls to context.execute() here emit the given string to the | |||
|
61 | script output. | |||
|
62 | ||||
|
63 | """ | |||
|
64 | url = config.get_main_option("sqlalchemy.url") | |||
|
65 | context.configure(url=url, target_metadata=target_metadata, | |||
|
66 | transaction_per_migration=True, | |||
|
67 | version_table=VERSION_TABLE_NAME) | |||
|
68 | ||||
|
69 | with context.begin_transaction(): | |||
|
70 | context.run_migrations() | |||
|
71 | ||||
|
72 | ||||
|
73 | def run_migrations_online(): | |||
|
74 | """Run migrations in 'online' mode. | |||
|
75 | ||||
|
76 | In this scenario we need to create an Engine | |||
|
77 | and associate a connection with the context. | |||
|
78 | ||||
|
79 | """ | |||
|
80 | engine = engine_from_config( | |||
|
81 | config.get_section(config.config_ini_section), | |||
|
82 | prefix='sqlalchemy.', | |||
|
83 | poolclass=pool.NullPool) | |||
|
84 | ||||
|
85 | connection = engine.connect() | |||
|
86 | context.configure( | |||
|
87 | connection=connection, | |||
|
88 | target_metadata=target_metadata, | |||
|
89 | transaction_per_migration=True, | |||
|
90 | version_table=VERSION_TABLE_NAME | |||
|
91 | ) | |||
|
92 | ||||
|
93 | try: | |||
|
94 | with context.begin_transaction(): | |||
|
95 | context.run_migrations() | |||
|
96 | finally: | |||
|
97 | connection.close() | |||
|
98 | ||||
|
99 | ||||
|
100 | if context.is_offline_mode(): | |||
|
101 | run_migrations_offline() | |||
|
102 | else: | |||
|
103 | run_migrations_online() |
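Both the offline and online paths in this `env.py` hand the same version-table settings to `context.configure`. A minimal sketch of that shared configuration — the plain dict is purely illustrative (the real code calls Alembic's context API directly), but the values mirror the file above:

```python
# Values taken from the env.py above; the dict wrapper is illustrative only.
VERSION_TABLE_NAME = 'alembic_appenlight_version'

def shared_configure_kwargs(target_metadata=None):
    return {
        'target_metadata': target_metadata,
        # commit each migration in its own transaction instead of one
        # transaction for the whole upgrade run
        'transaction_per_migration': True,
        # custom version-table name so appenlight's migration state does not
        # clash with other alembic-managed apps sharing the database
        'version_table': VERSION_TABLE_NAME,
    }

print(shared_configure_kwargs()['version_table'])
```

Keeping these kwargs identical in both modes matters: offline SQL generation and online execution must track revisions in the same table.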
@@ -0,0 +1,22 b'' | |||||
|
1 | """${message} | |||
|
2 | ||||
|
3 | Revision ID: ${up_revision} | |||
|
4 | Revises: ${down_revision} | |||
|
5 | Create Date: ${create_date} | |||
|
6 | ||||
|
7 | """ | |||
|
8 | ||||
|
9 | # revision identifiers, used by Alembic. | |||
|
10 | revision = ${repr(up_revision)} | |||
|
11 | down_revision = ${repr(down_revision)} | |||
|
12 | ||||
|
13 | from alembic import op | |||
|
14 | import sqlalchemy as sa | |||
|
15 | ${imports if imports else ""} | |||
|
16 | ||||
|
17 | def upgrade(): | |||
|
18 | ${upgrades if upgrades else "pass"} | |||
|
19 | ||||
|
20 | ||||
|
21 | def downgrade(): | |||
|
22 | ${downgrades if downgrades else "pass"} |
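The `${...}` placeholders in `script.py.mako` are rendered by mako when Alembic generates a new revision file. For simple name substitutions the effect resembles the stdlib `string.Template` — used below purely as an illustration (mako, not `string.Template`, renders the real template, and expressions like `${repr(up_revision)}` need mako's full evaluation):

```python
from string import Template

# Illustration only: simple ${name} placeholders substitute the same way
# in string.Template as they do in the mako template above.
header = Template('"""${message}\n\nRevision ID: ${up_revision}\n"""')
print(header.substitute(message='initial tables',
                        up_revision='55b6e612672f'))
```

The generated file's `revision`/`down_revision` identifiers are what Alembic stores in the version table to order migrations.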
This diff has been collapsed as it changes many lines (629 lines changed). | |||
@@ -0,0 +1,629 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | """initial tables | |||
|
23 | ||||
|
24 | Revision ID: 55b6e612672f | |||
|
25 | Revises: None | |||
|
26 | Create Date: 2014-10-13 23:47:38.295159 | |||
|
27 | ||||
|
28 | """ | |||
|
29 | ||||
|
30 | # revision identifiers, used by Alembic. | |||
|
31 | revision = '55b6e612672f' | |||
|
32 | down_revision = None | |||
|
33 | ||||
|
34 | from alembic import op | |||
|
35 | import sqlalchemy as sa | |||
|
36 | ||||
|
37 | ||||
|
38 | def upgrade(): | |||
|
39 | op.add_column('users', sa.Column('first_name', sa.Unicode(25))) | |||
|
40 | op.add_column('users', sa.Column('last_name', sa.Unicode(50))) | |||
|
41 | op.add_column('users', sa.Column('company_name', sa.Unicode(255))) | |||
|
42 | op.add_column('users', sa.Column('company_address', sa.Unicode(255))) | |||
|
43 | op.add_column('users', sa.Column('phone1', sa.Unicode(25))) | |||
|
44 | op.add_column('users', sa.Column('phone2', sa.Unicode(25))) | |||
|
45 | op.add_column('users', sa.Column('zip_code', sa.Unicode(25))) | |||
|
46 | op.add_column('users', sa.Column('default_report_sort', sa.Unicode(20), nullable=False, server_default="newest")) | |||
|
47 | op.add_column('users', sa.Column('city', sa.Unicode(128))) | |||
|
48 | op.add_column('users', sa.Column('notes', sa.UnicodeText, server_default='')) | |||
|
49 | op.add_column('users', sa.Column('notifications', sa.Boolean(), nullable=False, server_default='true')) | |||
|
50 | op.add_column('users', sa.Column('registration_ip', sa.Unicode(40), nullable=False, server_default='')) | |||
|
51 | ||||
|
52 | op.create_table( | |||
|
53 | 'integrations', | |||
|
54 | sa.Column('id', sa.Integer(), primary_key=True), | |||
|
55 | sa.Column('resource_id', sa.Integer(), | |||
|
56 | sa.ForeignKey('resources.resource_id', onupdate='cascade', | |||
|
57 | ondelete='cascade')), | |||
|
58 | sa.Column('integration_name', sa.Unicode(64)), | |||
|
59 | sa.Column('config', sa.dialects.postgresql.JSON, nullable=False), | |||
|
60 | sa.Column('modified_date', sa.DateTime(), nullable=False, server_default=sa.func.now()), | |||
|
61 | sa.Column('external_id', sa.Unicode(255)), | |||
|
62 | sa.Column('external_id2', sa.Unicode(255)) | |||
|
63 | ) | |||
|
64 | ||||
|
65 | op.create_table( | |||
|
66 | 'alert_channels', | |||
|
67 | sa.Column('owner_id', sa.Integer(), | |||
|
68 | sa.ForeignKey('users.id', onupdate='cascade', | |||
|
69 | ondelete='cascade'), nullable=False), | |||
|
70 | sa.Column('channel_name', sa.Unicode(25), nullable=False), | |||
|
71 | sa.Column('channel_value', sa.Unicode(80), nullable=False), | |||
|
72 | sa.Column('channel_json_conf', sa.dialects.postgresql.JSON, nullable=False), | |||
|
73 | sa.Column('channel_validated', sa.Boolean, nullable=False, server_default='False'), | |||
|
74 | sa.Column('send_alerts', sa.Boolean, nullable=False, server_default='True'), | |||
|
75 | sa.Column('notify_only_first', sa.Boolean, nullable=False, server_default='False'), | |||
|
76 | sa.Column('daily_digest', sa.Boolean, nullable=False, server_default='True'), | |||
|
77 | sa.Column('pkey', sa.Integer(), primary_key=True), | |||
|
78 | sa.Column('integration_id', sa.Integer, | |||
|
79 | sa.ForeignKey('integrations.id', onupdate='cascade', | |||
|
80 | ondelete='cascade')), | |||
|
81 | ) | |||
|
82 | op.create_unique_constraint('uq_alert_channels', 'alert_channels', | |||
|
83 | ["owner_id", "channel_name", "channel_value"]) | |||
|

    op.create_table(
        'alert_channels_actions',
        sa.Column('owner_id', sa.Integer(), nullable=False),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('pkey', sa.Integer(), primary_key=True),
        sa.Column('action', sa.Unicode(10), nullable=False, server_default='always'),
        sa.Column('rule', sa.dialects.postgresql.JSON),
        sa.Column('type', sa.Unicode(10), index=True),
        sa.Column('other_id', sa.Unicode(40), index=True),
        sa.Column('config', sa.dialects.postgresql.JSON),
        sa.Column('name', sa.Unicode(255), server_default='')
    )
|


    op.create_table(
        'application_postprocess_conf',
        sa.Column('pkey', sa.Integer(), primary_key=True),
        sa.Column('do', sa.Unicode(25), nullable=False),
        sa.Column('new_value', sa.UnicodeText(), nullable=False, server_default=''),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id',
                                onupdate='cascade',
                                ondelete='cascade'), nullable=False),
        sa.Column('rule', sa.dialects.postgresql.JSON),
    )
|

    op.create_table(
        'applications',
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade'), nullable=False,
                  primary_key=True, autoincrement=False),
        sa.Column('domains', sa.UnicodeText, nullable=False),
        sa.Column('api_key', sa.Unicode(32), nullable=False, index=True),
        sa.Column('default_grouping', sa.Unicode(20), nullable=False, server_default='url_type'),
        sa.Column('public_key', sa.Unicode(32), nullable=False, index=True),
        sa.Column('error_report_threshold', sa.Integer(), server_default='10', nullable=False),
        sa.Column('slow_report_threshold', sa.Integer(), server_default='10', nullable=False),
        sa.Column('apdex_threshold', sa.Float(), server_default='0.7', nullable=False),
        sa.Column('allow_permanent_storage', sa.Boolean(), server_default="false", nullable=False),
    )
    op.create_unique_constraint(None, 'applications',
                                ["public_key"])
    op.create_unique_constraint(None, 'applications',
                                ["api_key"])
|

    op.create_table(
        'metrics',
        sa.Column('pkey', sa.types.BigInteger, nullable=False, primary_key=True),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id',
                                onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('timestamp', sa.DateTime),
        sa.Column('namespace', sa.Unicode(255)),
        sa.Column('tags', sa.dialects.postgresql.JSON, server_default="{}")
    )
|

    op.create_table(
        'events',
        sa.Column('id', sa.Integer, nullable=False, primary_key=True),
        sa.Column('start_date', sa.DateTime, nullable=False, index=True),
        sa.Column('end_date', sa.DateTime),
        sa.Column('status', sa.Integer(), nullable=False, index=True),
        sa.Column('event_type', sa.Integer(), nullable=False, index=True),
        sa.Column('origin_user_id', sa.Integer()),
        sa.Column('target_user_id', sa.Integer()),
        sa.Column('resource_id', sa.Integer(), index=True),
        sa.Column('text', sa.UnicodeText, server_default=''),
        sa.Column('values', sa.dialects.postgresql.JSON),
        sa.Column('target_id', sa.Integer()),
        sa.Column('target_uuid', sa.Unicode(40), index=True)
    )
|

    op.create_table(
        'logs',
        sa.Column('log_id', sa.types.BigInteger, nullable=False, primary_key=True),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id',
                                onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('log_level', sa.SmallInteger(), nullable=False),
        sa.Column('primary_key', sa.Unicode(128), nullable=True),
        sa.Column('message', sa.UnicodeText, nullable=False, server_default=''),
        sa.Column('timestamp', sa.DateTime),
        sa.Column('namespace', sa.Unicode(255)),
        sa.Column('request_id', sa.Unicode(40)),
        sa.Column('tags', sa.dialects.postgresql.JSON, server_default="{}"),
        sa.Column('permanent', sa.Boolean(), server_default="false",
                  nullable=False)
    )
|

    op.create_table(
        'reports_groups',
        sa.Column('id', sa.types.BigInteger, primary_key=True),
        sa.Column('resource_id', sa.Integer,
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade'), nullable=False),
        sa.Column('priority', sa.Integer, nullable=False, server_default="5"),
        sa.Column('first_timestamp', sa.DateTime(), nullable=False, server_default=sa.func.now()),
        sa.Column('last_timestamp', sa.DateTime()),
        sa.Column('error', sa.UnicodeText, nullable=False, server_default=""),
        sa.Column('grouping_hash', sa.Unicode(40), nullable=False, server_default=""),
        sa.Column('triggered_postprocesses_ids', sa.dialects.postgresql.JSON, nullable=False, server_default="[]"),
        sa.Column('report_type', sa.Integer, nullable=False, server_default="0"),
        sa.Column('total_reports', sa.Integer, nullable=False, server_default="0"),
        sa.Column('last_report', sa.Integer, nullable=False, server_default="0"),
        sa.Column('occurences', sa.Integer, nullable=False, server_default="1"),
        sa.Column('average_duration', sa.Float(), nullable=False, server_default="0"),
        sa.Column('summed_duration', sa.Float(), nullable=False, server_default="0"),
        sa.Column('notified', sa.Boolean, nullable=False, server_default="False"),
        sa.Column('fixed', sa.Boolean, nullable=False, server_default="False"),
        sa.Column('public', sa.Boolean, nullable=False, server_default="False"),
        sa.Column('read', sa.Boolean, nullable=False, server_default="False"),
    )
|

    op.create_table(
        'reports',
        sa.Column('id', sa.types.BigInteger, primary_key=True),
        sa.Column('group_id', sa.types.BigInteger,
                  sa.ForeignKey('reports_groups.id', onupdate='cascade',
                                ondelete='cascade'), nullable=False, index=True),
        sa.Column('resource_id', sa.Integer, nullable=False, index=True),
        sa.Column('report_type', sa.Integer, nullable=False, server_default="0"),
        sa.Column('error', sa.UnicodeText, nullable=False, server_default=""),
        sa.Column('extra', sa.dialects.postgresql.JSON, nullable=False, server_default="{}"),
        sa.Column('request', sa.dialects.postgresql.JSON, nullable=False, server_default="{}"),
        sa.Column('tags', sa.dialects.postgresql.JSON, nullable=False, server_default="{}"),
        sa.Column('ip', sa.Unicode(39), nullable=False, server_default=""),
        sa.Column('username', sa.Unicode(255), nullable=False, server_default=""),
        sa.Column('user_agent', sa.Unicode(512), nullable=False, server_default=""),
        sa.Column('url', sa.UnicodeText, nullable=False, server_default=""),
        sa.Column('request_id', sa.Unicode(40), nullable=False, server_default=""),
        sa.Column('request_stats', sa.dialects.postgresql.JSON, nullable=False, server_default="{}"),
        sa.Column('traceback', sa.dialects.postgresql.JSON, nullable=False, server_default="{}"),
        sa.Column('traceback_hash', sa.Unicode(40), nullable=False, server_default=""),
        sa.Column('start_time', sa.DateTime(), nullable=False, server_default=sa.func.now()),
        sa.Column('end_time', sa.DateTime()),
        sa.Column('report_group_time', sa.DateTime, index=True, nullable=False, server_default=sa.func.now()),
        sa.Column('duration', sa.Float(), nullable=False, server_default="0"),
        sa.Column('http_status', sa.Integer, index=True),
        sa.Column('url_domain', sa.Unicode(128)),
        sa.Column('url_path', sa.UnicodeText),
        sa.Column('language', sa.Integer, server_default="0"),
    )
    op.create_index(None, 'reports',
                    [sa.text("(tags ->> 'server_name')")])
    op.create_index(None, 'reports',
                    [sa.text("(tags ->> 'view_name')")])
|

    op.create_table(
        'reports_assignments',
        sa.Column('group_id', sa.types.BigInteger, nullable=False, primary_key=True),
        sa.Column('owner_id', sa.Integer,
                  sa.ForeignKey('users.id', onupdate='cascade', ondelete='cascade'),
                  nullable=False, primary_key=True),
        sa.Column('report_time', sa.DateTime, nullable=False)
    )
|

    op.create_table(
        'reports_comments',
        sa.Column('comment_id', sa.Integer, primary_key=True),
        sa.Column('body', sa.UnicodeText, nullable=False, server_default=''),
        sa.Column('owner_id', sa.Integer,
                  sa.ForeignKey('users.id', onupdate='cascade',
                                ondelete='set null'), nullable=True),
        sa.Column('created_timestamp', sa.DateTime, nullable=False, server_default=sa.func.now()),
        sa.Column('report_time', sa.DateTime, nullable=False),
        sa.Column('group_id', sa.types.BigInteger, nullable=False)
    )
|

    op.create_table(
        'reports_stats',
        sa.Column('resource_id', sa.Integer, nullable=False, index=True),
        sa.Column('start_interval', sa.DateTime, nullable=False, index=True),
        sa.Column('group_id', sa.types.BigInteger, index=True),
        sa.Column('occurences', sa.Integer, nullable=False, server_default='0', index=True),
        sa.Column('owner_user_id', sa.Integer),
        sa.Column('type', sa.Integer, index=True, nullable=False),
        sa.Column('duration', sa.Float(), server_default='0'),
        sa.Column('server_name', sa.Unicode(128),
                  server_default=''),
        sa.Column('view_name', sa.Unicode(128),
                  server_default=''),
        sa.Column('id', sa.BigInteger(), nullable=False, primary_key=True),
    )
    op.create_index('ix_reports_stats_start_interval_group_id', 'reports_stats',
                    ["start_interval", "group_id"])
|

    op.create_table(
        'slow_calls',
        sa.Column('id', sa.types.BigInteger, primary_key=True),
        sa.Column('report_id', sa.types.BigInteger,
                  sa.ForeignKey('reports.id', onupdate='cascade', ondelete='cascade'),
                  nullable=False, index=True),
        sa.Column('duration', sa.Float(), nullable=False, server_default="0", index=True),
        sa.Column('timestamp', sa.DateTime, nullable=False, server_default=sa.func.now(), index=True),
        sa.Column('report_group_time', sa.DateTime, index=True, nullable=False, server_default=sa.func.now()),
        sa.Column('type', sa.Unicode(16), nullable=False, index=True),
        sa.Column('statement', sa.UnicodeText, nullable=False, server_default=''),
        sa.Column('parameters', sa.dialects.postgresql.JSON, nullable=False),
        sa.Column('location', sa.UnicodeText, server_default=''),
        sa.Column('subtype', sa.Unicode(16), nullable=False, index=True),
        sa.Column('resource_id', sa.Integer, nullable=False, index=True),
        sa.Column('statement_hash', sa.Unicode(60), index=True)
    )
|

    op.create_table(
        'tags',
        sa.Column('id', sa.types.BigInteger, primary_key=True),
        sa.Column('resource_id', sa.Integer,
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('first_timestamp', sa.DateTime, nullable=False, server_default=sa.func.now()),
        sa.Column('last_timestamp', sa.DateTime, nullable=False, server_default=sa.func.now()),
        sa.Column('name', sa.Unicode(32), nullable=False),
        sa.Column('value', sa.dialects.postgresql.JSON, nullable=False),
        sa.Column('times_seen', sa.Integer, nullable=False, server_default='1')
    )
|

    op.create_table(
        'auth_tokens',
        sa.Column('id', sa.Integer, nullable=False, primary_key=True),
        sa.Column('token', sa.Unicode),
        sa.Column('creation_date', sa.DateTime, nullable=False, server_default=sa.func.now()),
        sa.Column('expires', sa.DateTime),
        sa.Column('owner_id', sa.Integer,
                  sa.ForeignKey('users.id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('description', sa.Unicode),
    )
|

    op.create_table(
        'channels_actions',
        sa.Column('channel_pkey', sa.Integer,
                  sa.ForeignKey('alert_channels.pkey',
                                ondelete='CASCADE', onupdate='CASCADE')),
        sa.Column('action_pkey', sa.Integer,
                  sa.ForeignKey('alert_channels_actions.pkey',
                                ondelete='CASCADE', onupdate='CASCADE'))
    )
|

    op.create_table(
        'config',
        sa.Column('key', sa.Unicode(128), primary_key=True),
        sa.Column('section', sa.Unicode(128), primary_key=True),
        sa.Column('value', sa.dialects.postgresql.JSON,
                  server_default="{}")
    )
|

    op.create_table(
        'plugin_configs',
        sa.Column('id', sa.Integer, primary_key=True),
        sa.Column('plugin_name', sa.Unicode(128)),
        sa.Column('section', sa.Unicode(128)),
        sa.Column('config', sa.dialects.postgresql.JSON,
                  server_default="{}"),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('owner_id', sa.Integer(),
                  sa.ForeignKey('users.id', onupdate='cascade',
                                ondelete='cascade')))
|

    op.create_table(
        'rc_versions',
        sa.Column('name', sa.Unicode(40), primary_key=True),
        sa.Column('value', sa.Unicode(40)),
    )
    version_table = sa.table('rc_versions',
                             sa.Column('name', sa.Unicode(40)),
                             sa.Column('value', sa.Unicode(40)))

    insert = version_table.insert().values(name='es_reports')
    op.execute(insert)
    insert = version_table.insert().values(name='es_reports_groups')
    op.execute(insert)
    insert = version_table.insert().values(name='es_reports_stats')
    op.execute(insert)
    insert = version_table.insert().values(name='es_logs')
    op.execute(insert)
    insert = version_table.insert().values(name='es_metrics')
    op.execute(insert)
    insert = version_table.insert().values(name='es_slow_calls')
    op.execute(insert)
|


    op.execute('''
    CREATE OR REPLACE FUNCTION floor_time_5min(timestamp without time zone)
    RETURNS timestamp without time zone AS
    $BODY$SELECT date_trunc('hour', $1) + INTERVAL '5 min' * FLOOR(date_part('minute', $1) / 5.0)$BODY$
    LANGUAGE sql VOLATILE;
    ''')
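For reference, the flooring that `floor_time_5min` performs in SQL (truncate to the hour, then add one 5-minute step per whole 5-minute block of the minute part) can be sketched in Python; this snippet is illustrative only and is not part of the migration:

```python
from datetime import datetime, timedelta

def floor_time_5min(ts: datetime) -> datetime:
    # Truncate to the hour, then add 5-minute steps for each
    # complete 5-minute block of the minute component.
    hour = ts.replace(minute=0, second=0, microsecond=0)
    return hour + timedelta(minutes=5 * (ts.minute // 5))

print(floor_time_5min(datetime(2016, 5, 12, 14, 37, 22)))
# 2016-05-12 14:35:00
```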
|

    op.execute('''
    CREATE OR REPLACE FUNCTION partition_logs() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
        AS $$
    DECLARE
        main_table varchar := 'logs';
        partitioned_table varchar := '';
    BEGIN

        IF NEW.permanent THEN
            partitioned_table := main_table || '_p_' || date_part('year', NEW.timestamp)::TEXT || '_' || DATE_part('month', NEW.timestamp);
        ELSE
            partitioned_table := main_table || '_p_' || date_part('year', NEW.timestamp)::TEXT || '_' || DATE_part('month', NEW.timestamp) || '_' || DATE_part('day', NEW.timestamp);
        END IF;

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            IF NEW.permanent THEN
                EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( timestamp >= DATE %s AND timestamp < DATE %s)) INHERITS (%s)',
                               partitioned_table,
                               quote_literal(date_trunc('month', NEW.timestamp)::date),
                               quote_literal((date_trunc('month', NEW.timestamp)::date + interval '1 month')::text),
                               main_table);
                EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(log_id);', partitioned_table, partitioned_table);
                EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_timestamp ON %s (timestamp);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_namespace_resource_id ON %s (namespace, resource_id);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s (resource_id);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_pkey_namespace ON %s (primary_key, namespace);', partitioned_table, partitioned_table);
            ELSE
                EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( timestamp >= DATE %s AND timestamp < DATE %s)) INHERITS (%s)',
                               partitioned_table,
                               quote_literal(date_trunc('day', NEW.timestamp)::date),
                               quote_literal((date_trunc('day', NEW.timestamp)::date + interval '1 day')::text),
                               main_table);
                EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s_ PRIMARY KEY(log_id);', partitioned_table, partitioned_table);
                EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_timestamp ON %s (timestamp);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_namespace_resource_id ON %s (namespace, resource_id);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s (resource_id);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_primary_key_namespace ON %s (primary_key, namespace);', partitioned_table, partitioned_table);
            END IF;

            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;

        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_logs BEFORE INSERT ON logs FOR EACH ROW EXECUTE PROCEDURE partition_logs();
    ''')
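For illustration only (not part of the migration): the partition-name scheme `partition_logs` builds on the fly routes permanent rows to monthly partitions and non-permanent rows to daily ones. A Python sketch of the same naming logic (the function name here is hypothetical):

```python
from datetime import datetime

def logs_partition_name(ts: datetime, permanent: bool) -> str:
    # Matches the trigger: 'logs_p_<year>_<month>' for permanent rows,
    # 'logs_p_<year>_<month>_<day>' otherwise (no zero padding, as in
    # the date_part concatenation in PL/pgSQL).
    name = 'logs_p_%d_%d' % (ts.year, ts.month)
    if not permanent:
        name += '_%d' % ts.day
    return name

print(logs_partition_name(datetime(2016, 5, 12), True))   # logs_p_2016_5
print(logs_partition_name(datetime(2016, 5, 12), False))  # logs_p_2016_5_12
```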
|

    op.execute('''
    CREATE OR REPLACE FUNCTION partition_metrics() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
        AS $$
    DECLARE
        main_table varchar := 'metrics';
        partitioned_table varchar := '';
    BEGIN

        partitioned_table := main_table || '_p_' || date_part('year', NEW.timestamp)::TEXT || '_' || DATE_part('month', NEW.timestamp) || '_' || DATE_part('day', NEW.timestamp);

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( timestamp >= DATE %s AND timestamp < DATE %s)) INHERITS (%s)',
                           partitioned_table,
                           quote_literal(date_trunc('day', NEW.timestamp)::date),
                           quote_literal((date_trunc('day', NEW.timestamp)::date + interval '1 day')::text),
                           main_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(pkey);', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_timestamp ON %s (timestamp);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s (resource_id);', partitioned_table, partitioned_table);
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;

        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_metrics BEFORE INSERT ON metrics FOR EACH ROW EXECUTE PROCEDURE partition_metrics();
    ''')
|

    op.execute('''
    CREATE FUNCTION partition_reports_stats() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
        AS $$
    DECLARE
        main_table varchar := 'reports_stats';
        partitioned_table varchar := '';
    BEGIN

        partitioned_table := main_table || '_p_' || date_part('year', NEW.start_interval)::TEXT || '_' || DATE_part('month', NEW.start_interval);

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( start_interval >= DATE %s AND start_interval < DATE %s )) INHERITS (%s)',
                           partitioned_table,
                           quote_literal(date_trunc('month', NEW.start_interval)::date),
                           quote_literal((date_trunc('month', NEW.start_interval)::date + interval '1 month')::text),
                           main_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(id);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_start_interval ON %s USING btree (start_interval);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_type ON %s USING btree (type);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s USING btree (resource_id);', partitioned_table, partitioned_table);
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;
        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_reports_stats BEFORE INSERT ON reports_stats FOR EACH ROW EXECUTE PROCEDURE partition_reports_stats();
    ''')
|

    op.execute('''
    CREATE OR REPLACE FUNCTION partition_reports_groups() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
        AS $$
    DECLARE
        main_table varchar := 'reports_groups';
        partitioned_table varchar := '';
    BEGIN

        partitioned_table := main_table || '_p_' || date_part('year', NEW.first_timestamp)::TEXT || '_' || DATE_part('month', NEW.first_timestamp);

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( first_timestamp >= DATE %s AND first_timestamp < DATE %s )) INHERITS (%s)',
                           partitioned_table,
                           quote_literal(date_trunc('month', NEW.first_timestamp)::date),
                           quote_literal((date_trunc('month', NEW.first_timestamp)::date + interval '1 month')::text),
                           main_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(id);', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;
        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_reports_groups BEFORE INSERT ON reports_groups FOR EACH ROW EXECUTE PROCEDURE partition_reports_groups();
    ''')
|

    op.execute('''
    CREATE OR REPLACE FUNCTION partition_reports() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
        AS $$
    DECLARE
        main_table varchar := 'reports';
        partitioned_table varchar := '';
        partitioned_parent_table varchar := '';
    BEGIN

        partitioned_table := main_table || '_p_' || date_part('year', NEW.report_group_time)::TEXT || '_' || DATE_part('month', NEW.report_group_time);
        partitioned_parent_table := 'reports_groups_p_' || date_part('year', NEW.report_group_time)::TEXT || '_' || DATE_part('month', NEW.report_group_time);

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( report_group_time >= DATE %s AND report_group_time < DATE %s )) INHERITS (%s)',
                           partitioned_table,
                           quote_literal(date_trunc('month', NEW.report_group_time)::date),
                           quote_literal((date_trunc('month', NEW.report_group_time)::date + interval '1 month')::text),
                           main_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(id);', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_group_id FOREIGN KEY (group_id) REFERENCES %s (id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table, partitioned_parent_table);
            EXECUTE format('CREATE INDEX ix_%s_report_group_time ON %s USING btree (report_group_time);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_group_id ON %s USING btree (group_id);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s USING btree (resource_id);', partitioned_table, partitioned_table);
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;
        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_reports BEFORE INSERT ON reports FOR EACH ROW EXECUTE PROCEDURE partition_reports();
    ''')
|


    op.execute('''
    CREATE OR REPLACE FUNCTION partition_slow_calls() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
        AS $$
    DECLARE
        main_table varchar := 'slow_calls';
        partitioned_table varchar := '';
        partitioned_parent_table varchar := '';
    BEGIN

        partitioned_table := main_table || '_p_' || date_part('year', NEW.report_group_time)::TEXT || '_' || DATE_part('month', NEW.report_group_time);
        partitioned_parent_table := 'reports_p_' || date_part('year', NEW.report_group_time)::TEXT || '_' || DATE_part('month', NEW.report_group_time);

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( report_group_time >= DATE %s AND report_group_time < DATE %s )) INHERITS (%s)',
                           partitioned_table,
                           quote_literal(date_trunc('month', NEW.report_group_time)::date),
                           quote_literal((date_trunc('month', NEW.report_group_time)::date + interval '1 month')::text),
                           main_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(id);', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_report_id FOREIGN KEY (report_id) REFERENCES %s (id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table, partitioned_parent_table);
            EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s USING btree (resource_id);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_report_id ON %s USING btree (report_id);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_timestamp ON %s USING btree (timestamp);', partitioned_table, partitioned_table);
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;
        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_slow_calls BEFORE INSERT ON slow_calls FOR EACH ROW EXECUTE PROCEDURE partition_slow_calls();
    ''')
|

def downgrade():
    pass
@@ -0,0 +1,135 b''
|
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import logging

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import MetaData
from sqlalchemy.orm import scoped_session
from sqlalchemy.orm import sessionmaker
from zope.sqlalchemy import ZopeTransactionExtension
import ziggurat_foundations
from ziggurat_foundations.models.base import get_db_session

log = logging.getLogger(__name__)

DBSession = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))

NAMING_CONVENTION = {
    "ix": 'ix_%(column_0_label)s',
    "uq": "uq_%(table_name)s_%(column_0_name)s",
    "ck": "ck_%(table_name)s_%(constraint_name)s",
    "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
    "pk": "pk_%(table_name)s"
}
|
43 | ||||
|
44 | metadata = MetaData(naming_convention=NAMING_CONVENTION) | |||
|
45 | Base = declarative_base(metadata=metadata) | |||
|
46 | ||||
|
47 | # optional for request.db approach | |||
|
48 | ziggurat_foundations.models.DBSession = DBSession | |||
|
49 | ||||
|
50 | ||||
|
51 | class Datastores(object): | |||
|
52 | redis = None | |||
|
53 | es = None | |||
|
54 | ||||
|
55 | ||||
|
56 | def register_datastores(es_conn, redis_conn, redis_lockmgr): | |||
|
57 | Datastores.es = es_conn | |||
|
58 | Datastores.redis = redis_conn | |||
|
59 | Datastores.lockmgr = redis_lockmgr | |||
|
60 | ||||
|
61 | ||||
|
62 | class SliceableESQuery(object): | |||
|
63 | def __init__(self, query, sort_query=None, aggregations=False, **kwconfig): | |||
|
64 | self.query = query | |||
|
65 | self.sort_query = sort_query | |||
|
66 | self.aggregations = aggregations | |||
|
67 | self.items_per_page = kwconfig.pop('items_per_page', 10) | |||
|
68 | self.page = kwconfig.pop('page', 1) | |||
|
69 | self.kwconfig = kwconfig | |||
|
70 | self.result = None | |||
|
71 | ||||
|
72 | def __getitem__(self, index): | |||
|
73 | config = self.kwconfig.copy() | |||
|
74 | config['es_from'] = index.start | |||
|
75 | query = self.query.copy() | |||
|
76 | if self.sort_query: | |||
|
77 | query.update(self.sort_query) | |||
|
78 | self.result = Datastores.es.search(query, size=self.items_per_page, | |||
|
79 | **config) | |||
|
80 | if self.aggregations: | |||
|
81 | self.items = self.result.get('aggregations') | |||
|
82 | else: | |||
|
83 | self.items = self.result['hits']['hits'] | |||
|
84 | ||||
|
85 | return self.items | |||
|
86 | ||||
|
87 | def __iter__(self): | |||
|
88 | return self.result | |||
|
89 | ||||
|
90 | def __len__(self): | |||
|
91 | config = self.kwconfig.copy() | |||
|
92 | query = self.query.copy() | |||
|
93 | self.result = Datastores.es.search(query, size=self.items_per_page, | |||
|
94 | **config) | |||
|
95 | if self.aggregations: | |||
|
96 | self.items = self.result.get('aggregations') | |||
|
97 | else: | |||
|
98 | self.items = self.result['hits']['hits'] | |||
|
99 | ||||
|
100 | count = int(self.result['hits']['total']) | |||
|
101 | return count if count < 5000 else 5000 | |||
|
102 | ||||
|
103 | ||||
|
104 | from appenlight.models.resource import Resource | |||
|
105 | from appenlight.models.application import Application | |||
|
106 | from appenlight.models.user import User | |||
|
107 | from appenlight.models.alert_channel import AlertChannel | |||
|
108 | from appenlight.models.alert_channel_action import AlertChannelAction | |||
|
109 | from appenlight.models.request_metric import Metric | |||
|
110 | from appenlight.models.application_postprocess_conf import \ | |||
|
111 | ApplicationPostprocessConf | |||
|
112 | from appenlight.models.auth_token import AuthToken | |||
|
113 | from appenlight.models.event import Event | |||
|
114 | from appenlight.models.external_identity import ExternalIdentity | |||
|
115 | from appenlight.models.group import Group | |||
|
116 | from appenlight.models.group_permission import GroupPermission | |||
|
117 | from appenlight.models.group_resource_permission import GroupResourcePermission | |||
|
118 | from appenlight.models.log import Log | |||
|
119 | from appenlight.models.plugin_config import PluginConfig | |||
|
120 | from appenlight.models.report import Report | |||
|
121 | from appenlight.models.report_group import ReportGroup | |||
|
122 | from appenlight.models.report_comment import ReportComment | |||
|
123 | from appenlight.models.report_assignment import ReportAssignment | |||
|
124 | from appenlight.models.report_stat import ReportStat | |||
|
125 | from appenlight.models.slow_call import SlowCall | |||
|
126 | from appenlight.models.tag import Tag | |||
|
127 | from appenlight.models.user_group import UserGroup | |||
|
128 | from appenlight.models.user_permission import UserPermission | |||
|
129 | from appenlight.models.user_resource_permission import UserResourcePermission | |||
|
130 | from ziggurat_foundations import ziggurat_model_init | |||
|
131 | ||||
|
132 | ziggurat_model_init(User, Group, UserGroup, GroupPermission, UserPermission, | |||
|
133 | UserResourcePermission, GroupResourcePermission, | |||
|
134 | Resource, | |||
|
135 | ExternalIdentity, passwordmanager=None) |
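`SliceableESQuery` exists so a paginator can treat an Elasticsearch response like a list: slicing maps the slice start onto `es_from`, and `len()` caps the reported total at 5000. A minimal sketch of that contract with a stubbed search client (the stub is an assumption for illustration, not the real `Datastores.es`):

```python
class StubES:
    """Pretends to be an ES client: returns `size` hits and a total of 12000."""
    def search(self, query, size=10, **config):
        return {'hits': {'hits': [{'_id': i} for i in range(size)],
                         'total': 12000}}

class SliceableQuery:
    def __init__(self, query, client, items_per_page=10):
        self.query, self.client = query, client
        self.items_per_page = items_per_page

    def __getitem__(self, index):
        # a paginator asks for e.g. items[0:10]; the slice start becomes es_from
        result = self.client.search(self.query, size=self.items_per_page,
                                    es_from=index.start)
        return result['hits']['hits']

    def __len__(self):
        result = self.client.search(self.query, size=self.items_per_page)
        count = int(result['hits']['total'])
        # deep totals are capped so pagination stays cheap
        return count if count < 5000 else 5000

q = SliceableQuery({'query': {'match_all': {}}}, StubES(), items_per_page=3)
print(len(q))        # capped at 5000
print(len(q[0:3]))   # 3 hits for the first page
```

The cap means a paginator never offers pages past the point where deep `from`/`size` requests become expensive.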
@@ -0,0 +1,296 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import logging
import sqlalchemy as sa
import urllib.request, urllib.parse, urllib.error
from datetime import timedelta
from appenlight.models import Base
from appenlight.lib.utils.date_utils import convert_date
from sqlalchemy.dialects.postgresql import JSON
from ziggurat_foundations.models.base import BaseModel

log = logging.getLogger(__name__)

channel_rules_m2m_table = sa.Table(
    'channels_actions', Base.metadata,
    sa.Column('channel_pkey', sa.Integer,
              sa.ForeignKey('alert_channels.pkey')),
    sa.Column('action_pkey', sa.Integer,
              sa.ForeignKey('alert_channels_actions.pkey'))
)

DATE_FRMT = '%Y-%m-%dT%H:%M'


class AlertChannel(Base, BaseModel):
    """
    Stores information about possible alerting options
    """
    __tablename__ = 'alert_channels'
    __possible_channel_names__ = ['email']
    __mapper_args__ = {
        'polymorphic_on': 'channel_name',
        'polymorphic_identity': 'integration'
    }

    owner_id = sa.Column(sa.Unicode(30),
                         sa.ForeignKey('users.id', onupdate='CASCADE',
                                       ondelete='CASCADE'))
    channel_name = sa.Column(sa.Unicode(25), nullable=False)
    channel_value = sa.Column(sa.Unicode(80), nullable=False, default='')
    channel_json_conf = sa.Column(JSON(), nullable=False, default='')
    channel_validated = sa.Column(sa.Boolean, nullable=False,
                                  default=False)
    send_alerts = sa.Column(sa.Boolean, nullable=False,
                            default=True)
    daily_digest = sa.Column(sa.Boolean, nullable=False,
                             default=True)
    integration_id = sa.Column(sa.Integer, sa.ForeignKey('integrations.id'),
                               nullable=True)
    pkey = sa.Column(sa.Integer(), nullable=False, primary_key=True)

    channel_actions = sa.orm.relationship('AlertChannelAction',
                                          cascade="all",
                                          passive_deletes=True,
                                          passive_updates=True,
                                          secondary=channel_rules_m2m_table,
                                          backref='channels')

    @property
    def channel_visible_value(self):
        if self.integration:
            return '{}: {}'.format(
                self.channel_name,
                self.integration.resource.resource_name
            )

        return '{}: {}'.format(
            self.channel_name,
            self.channel_value
        )

    def get_dict(self, exclude_keys=None, include_keys=None,
                 extended_info=True):
        """
        Returns dictionary with required information that will be consumed by
        angular
        """
        instance_dict = super(AlertChannel, self).get_dict(exclude_keys,
                                                           include_keys)
        exclude_keys_list = exclude_keys or []
        include_keys_list = include_keys or []

        instance_dict['supports_report_alerting'] = True
        instance_dict['channel_visible_value'] = self.channel_visible_value

        if extended_info:
            instance_dict['actions'] = [
                rule.get_dict(extended_info=True) for
                rule in self.channel_actions]

        del instance_dict['channel_json_conf']

        if self.integration:
            instance_dict[
                'supports_report_alerting'] = \
                self.integration.supports_report_alerting
        d = {}
        for k in instance_dict.keys():
            if (k not in exclude_keys_list and
                    (k in include_keys_list or not include_keys)):
                d[k] = instance_dict[k]
        return d

    def __repr__(self):
        return '<AlertChannel: (%s,%s), user:%s>' % (self.channel_name,
                                                     self.channel_value,
                                                     self.user_name,)

    def send_digest(self, **kwargs):
        """
        This should implement daily top error report notifications
        """
        log.warning('send_digest NOT IMPLEMENTED')

    def notify_reports(self, **kwargs):
        """
        This should implement notification of reports that occurred in a 1 min
        interval
        """
        log.warning('notify_reports NOT IMPLEMENTED')

    def notify_alert(self, **kwargs):
        """
        Notify user of report/uptime/chart threshold events based on the
        event's alert type

        Kwargs:
            application: application that the event applies for,
            event: event that is notified,
            user: user that should be notified
            request: request object

        """
        alert_name = kwargs['event'].unified_alert_name()
        if alert_name in ['slow_report_alert', 'error_report_alert']:
            self.notify_report_alert(**kwargs)
        elif alert_name == 'uptime_alert':
            self.notify_uptime_alert(**kwargs)
        elif alert_name == 'chart_alert':
            self.notify_chart_alert(**kwargs)

    def notify_chart_alert(self, **kwargs):
        """
        This should implement chart threshold alert notifications
        """
        log.warning('notify_chart_alert NOT IMPLEMENTED')

    def notify_report_alert(self, **kwargs):
        """
        This should implement report open/close alert notifications
        """
        log.warning('notify_report_alert NOT IMPLEMENTED')

    def notify_uptime_alert(self, **kwargs):
        """
        This should implement uptime open/close alert notifications
        """
        log.warning('notify_uptime_alert NOT IMPLEMENTED')

    def get_notification_basic_vars(self, kwargs):
        """
        Sets most common variables used later for rendering notifications for
        channel
        """
        if 'event' in kwargs:
            kwargs['since_when'] = kwargs['event'].start_date

        url_start_date = kwargs.get('since_when') - timedelta(minutes=1)
        url_end_date = kwargs.get('since_when') + timedelta(minutes=4)
        tmpl_vars = {
            "timestamp": kwargs['since_when'],
            "user": kwargs['user'],
            "since_when": kwargs.get('since_when'),
            "url_start_date": url_start_date,
            "url_end_date": url_end_date
        }
        tmpl_vars["resource_name"] = kwargs['resource'].resource_name
        tmpl_vars["resource"] = kwargs['resource']

        if 'event' in kwargs:
            tmpl_vars['event_values'] = kwargs['event'].values
            tmpl_vars['alert_type'] = kwargs['event'].unified_alert_name()
            tmpl_vars['alert_action'] = kwargs['event'].unified_alert_action()
        return tmpl_vars

    def report_alert_notification_vars(self, kwargs):
        tmpl_vars = self.get_notification_basic_vars(kwargs)
        reports = kwargs.get('reports', [])
        tmpl_vars["reports"] = reports
        tmpl_vars["confirmed_total"] = len(reports)

        tmpl_vars["report_type"] = "error reports"
        tmpl_vars["url_report_type"] = 'report'

        alert_type = tmpl_vars.get('alert_type', '')
        if 'slow_report' in alert_type:
            tmpl_vars["report_type"] = "slow reports"
            tmpl_vars["url_report_type"] = 'report/list_slow'

        app_url = kwargs['request'].registry.settings['_mail_url']

        destination_url = kwargs['request'].route_url('/',
                                                      _app_url=app_url)
        if alert_type:
            destination_url += 'ui/{}?resource={}&start_date={}&end_date={}'.format(
                tmpl_vars["url_report_type"],
                tmpl_vars['resource'].resource_id,
                tmpl_vars['url_start_date'].strftime(DATE_FRMT),
                tmpl_vars['url_end_date'].strftime(DATE_FRMT)
            )
        else:
            destination_url += 'ui/{}?resource={}'.format(
                tmpl_vars["url_report_type"],
                tmpl_vars['resource'].resource_id
            )
        tmpl_vars["destination_url"] = destination_url

        return tmpl_vars

    def uptime_alert_notification_vars(self, kwargs):
        tmpl_vars = self.get_notification_basic_vars(kwargs)
        app_url = kwargs['request'].registry.settings['_mail_url']
        destination_url = kwargs['request'].route_url('/', _app_url=app_url)
        destination_url += 'ui/{}?resource={}'.format(
            'uptime',
            tmpl_vars['resource'].resource_id)
        tmpl_vars['destination_url'] = destination_url

        reason = ''
        e_values = tmpl_vars.get('event_values')

        if e_values and e_values.get('response_time') == 0:
            reason += ' Response time was slower than 20 seconds.'
        elif e_values:
            code = e_values.get('status_code')
            reason += ' Response status code: %s.' % code

        tmpl_vars['reason'] = reason
        return tmpl_vars

    def chart_alert_notification_vars(self, kwargs):
        tmpl_vars = self.get_notification_basic_vars(kwargs)
        tmpl_vars['chart_name'] = tmpl_vars['event_values']['chart_name']
        tmpl_vars['action_name'] = tmpl_vars['event_values'].get(
            'action_name') or ''
        matched_values = tmpl_vars['event_values']['matched_step_values']
        tmpl_vars['readable_values'] = []
        for key, value in list(matched_values['values'].items()):
            matched_label = matched_values['labels'].get(key)
            if matched_label:
                tmpl_vars['readable_values'].append({
                    'label': matched_label['human_label'],
                    'value': value
                })
        tmpl_vars['readable_values'] = sorted(tmpl_vars['readable_values'],
                                              key=lambda x: x['label'])
        start_date = convert_date(tmpl_vars['event_values']['start_interval'])
        end_date = None
        if tmpl_vars['event_values'].get('end_interval'):
            end_date = convert_date(tmpl_vars['event_values']['end_interval'])

        app_url = kwargs['request'].registry.settings['_mail_url']
        destination_url = kwargs['request'].route_url('/', _app_url=app_url)
        to_encode = {
            'resource': tmpl_vars['event_values']['resource'],
            'start_date': start_date.strftime(DATE_FRMT),
        }
        if end_date:
            to_encode['end_date'] = end_date.strftime(DATE_FRMT)

        destination_url += 'ui/{}?{}'.format(
            'logs',
            urllib.parse.urlencode(to_encode)
        )
        tmpl_vars['destination_url'] = destination_url
        return tmpl_vars
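The `get_dict` method above ends with an include/exclude filter: excluded keys are always dropped, and when an include list is given only those keys survive. That rule can be isolated as (a sketch; `filter_keys` is a hypothetical helper, not part of the model):

```python
def filter_keys(instance_dict, exclude_keys=None, include_keys=None):
    # mirrors the loop at the end of get_dict: exclusion always wins,
    # and `not include_keys` keeps everything when no include list is passed
    exclude = exclude_keys or []
    include = include_keys or []
    return {k: v for k, v in instance_dict.items()
            if k not in exclude and (k in include or not include_keys)}

row = {'pkey': 1, 'channel_name': 'email', 'channel_json_conf': {}}
print(filter_keys(row, exclude_keys=['channel_json_conf']))
print(filter_keys(row, include_keys=['pkey']))
```

Note the asymmetry: an empty exclude list excludes nothing, and an omitted include list includes everything.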
@@ -0,0 +1,84 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import sqlalchemy as sa

from appenlight.models.resource import Resource
from appenlight.models import Base, get_db_session
from sqlalchemy.orm import validates
from ziggurat_foundations.models.base import BaseModel


class AlertChannelAction(Base, BaseModel):
    """
    Stores notification conditions for user's alert channels
    This is later used for rule parsing like "alert if http_status == 500"
    """
    __tablename__ = 'alert_channels_actions'

    types = ['report', 'chart']

    owner_id = sa.Column(sa.Integer,
                         sa.ForeignKey('users.id', onupdate='CASCADE',
                                       ondelete='CASCADE'))
    resource_id = sa.Column(sa.Integer())
    action = sa.Column(sa.Unicode(10), nullable=False, default='always')
    type = sa.Column(sa.Unicode(10), nullable=False)
    other_id = sa.Column(sa.Unicode(40))
    pkey = sa.Column(sa.Integer(), nullable=False, primary_key=True)
    rule = sa.Column(sa.dialects.postgresql.JSON,
                     nullable=False, default={'field': 'http_status',
                                              "op": "ge", "value": "500"})
    config = sa.Column(sa.dialects.postgresql.JSON)
    name = sa.Column(sa.Unicode(255))

    @validates('notify_type')
    def validate_email(self, key, notify_type):
        assert notify_type in ['always', 'only_first']
        return notify_type

    def resource_name(self, db_session=None):
        db_session = get_db_session(db_session)
        if self.resource_id:
            return Resource.by_resource_id(self.resource_id,
                                           db_session=db_session).resource_name
        else:
            return 'any resource'

    def get_dict(self, exclude_keys=None, include_keys=None,
                 extended_info=False):
        """
        Returns dictionary with required information that will be consumed by
        angular
        """
        instance_dict = super(AlertChannelAction, self).get_dict()
        exclude_keys_list = exclude_keys or []
        include_keys_list = include_keys or []
        if extended_info:
            instance_dict['channels'] = [
                c.get_dict(extended_info=False) for c in self.channels]

        d = {}
        for k in instance_dict.keys():
            if (k not in exclude_keys_list and
                    (k in include_keys_list or not include_keys)):
                d[k] = instance_dict[k]
        return d
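The `rule` column above defaults to a JSON condition equivalent to `http_status >= 500`. Evaluating such a rule amounts to mapping the `op` name onto a comparison; a minimal sketch (the `evaluate_rule` helper is illustrative — the real rule parsing lives elsewhere in appenlight):

```python
import operator

# op names as stored in the JSON rule, mapped to Python comparisons
OPS = {'eq': operator.eq, 'ne': operator.ne,
       'ge': operator.ge, 'gt': operator.gt,
       'le': operator.le, 'lt': operator.lt}

def evaluate_rule(rule, report):
    """Apply a {'field', 'op', 'value'} rule to a report dict."""
    op = OPS[rule['op']]
    return op(report.get(rule['field']), int(rule['value']))

rule = {'field': 'http_status', 'op': 'ge', 'value': '500'}
print(evaluate_rule(rule, {'http_status': 502}))  # True
print(evaluate_rule(rule, {'http_status': 200}))  # False
```

Storing the operator as a short name keeps the rule JSON-serializable while the lookup table stays the single place that defines which comparisons are allowed.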
@@ -0,0 +1,21 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/
@@ -0,0 +1,193 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | from appenlight.models.alert_channel import AlertChannel | |||
|
24 | from appenlight.models.integrations.campfire import CampfireIntegration | |||
|
25 | from webhelpers2.text import truncate | |||
|
26 | ||||
|
27 | log = logging.getLogger(__name__) | |||
|
28 | ||||
|
29 | ||||
|
30 | class CampfireAlertChannel(AlertChannel): | |||
|
31 | __mapper_args__ = { | |||
|
32 | 'polymorphic_identity': 'campfire' | |||
|
33 | } | |||
|
34 | ||||
|
35 | @property | |||
|
36 | def client(self): | |||
|
37 | client = CampfireIntegration.create_client( | |||
|
38 | self.integration.config['api_token'], | |||
|
39 | self.integration.config['account']) | |||
|
40 | return client | |||
|
41 | ||||
|
42 | def notify_reports(self, **kwargs): | |||
|
43 | """ | |||
|
44 | Notify user of individual reports | |||
|
45 | ||||
|
46 | kwargs: | |||
|
47 | application: application that the event applies for, | |||
|
48 | user: user that should be notified | |||
|
49 | request: request object | |||
|
50 | since_when: reports are newer than this time value, | |||
|
51 | reports: list of reports to render | |||
|
52 | ||||
|
53 | """ | |||
|
54 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
55 | ||||
|
56 | app_url = kwargs['request'].registry.settings['_mail_url'] | |||
|
57 | destination_url = kwargs['request'].route_url('/', | |||
|
58 | app_url=app_url) | |||
|
59 | f_args = ('report', | |||
|
60 | template_vars['resource'].resource_id, | |||
|
61 | template_vars['url_start_date'].strftime('%Y-%m-%dT%H:%M'), | |||
|
62 | template_vars['url_end_date'].strftime('%Y-%m-%dT%H:%M')) | |||
|
63 | destination_url += 'ui/{}?resource={}&start_date={}&end_date={}'.format( | |||
|
64 | *f_args) | |||
|
65 | ||||
|
66 | if template_vars['confirmed_total'] > 1: | |||
|
67 | template_vars["title"] = "%s - %s reports" % ( | |||
|
68 | template_vars['resource_name'], | |||
|
69 | template_vars['confirmed_total'], | |||
|
70 | ) | |||
|
71 | else: | |||
|
72 | error_title = truncate(template_vars['reports'][0][1].error or | |||
|
73 | 'slow report', 90) | |||
|
74 | template_vars["title"] = "%s - '%s' report" % ( | |||
|
75 | template_vars['resource_name'], | |||
|
76 | error_title) | |||
|
77 | ||||
|
78 | template_vars["title"] += ' ' + destination_url | |||
|
79 | ||||
|
80 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( | |||
|
81 | kwargs['user'].user_name, | |||
|
82 | self.channel_visible_value, | |||
|
83 | template_vars['confirmed_total']) | |||
|
84 | log.warning(log_msg) | |||
|
85 | ||||
|
86 | for room in self.integration.config['rooms'].split(','): | |||
|
87 | self.client.speak_to_room(room.strip(), template_vars["title"]) | |||
|
88 | ||||
|
89 | def notify_report_alert(self, **kwargs): | |||
|
90 | """ | |||
|
91 | Build and send report alert notification | |||
|
92 | ||||
|
93 | Kwargs: | |||
|
94 | application: application that the event applies for, | |||
|
95 | event: event that is notified, | |||
|
96 | user: user that should be notified | |||
|
97 | request: request object | |||
|
98 | ||||
|
99 | """ | |||
|
100 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
101 | ||||
|
102 | if kwargs['event'].unified_alert_action() == 'OPEN': | |||
|
103 | title = 'ALERT %s: %s - %s %s %s' % ( | |||
|
104 | template_vars['alert_action'], | |||
|
105 | template_vars['resource_name'], | |||
|
106 | kwargs['event'].values['reports'], | |||
|
107 | template_vars['report_type'], | |||
|
108 | template_vars['destination_url'] | |||
|
109 | ) | |||
|
110 | ||||
|
111 | else: | |||
|
112 | title = 'ALERT %s: %s type: %s' % ( | |||
|
113 | template_vars['alert_action'], | |||
|
114 | template_vars['resource_name'], | |||
|
115 | template_vars['alert_type'].replace('_', ' '), | |||
|
116 | ) | |||
|
117 | for room in self.integration.config['rooms'].split(','): | |||
|
118 | self.client.speak_to_room(room.strip(), title, sound='VUVUZELA') | |||
|
119 | ||||
|
120 | def notify_uptime_alert(self, **kwargs): | |||
|
121 | """ | |||
|
122 | Build and send uptime alert notification | |||
|
123 | ||||
|
124 | Kwargs: | |||
|
125 | application: application that the event applies for, | |||
|
126 | event: event that is notified, | |||
|
127 | user: user that should be notified | |||
|
128 | request: request object | |||
|
129 | ||||
|
130 | """ | |||
|
131 | template_vars = self.uptime_alert_notification_vars(kwargs) | |||
|
132 | ||||
|
133 | message = 'ALERT %s: %s has uptime issues %s\n\n' % ( | |||
|
134 | template_vars['alert_action'], | |||
|
135 | template_vars['resource_name'], | |||
|
136 | template_vars['destination_url'] | |||
|
137 | ) | |||
|
138 | message += template_vars['reason'] | |||
|
139 | ||||
|
140 | for room in self.integration.config['rooms'].split(','): | |||
|
141 | self.client.speak_to_room(room.strip(), message, sound='VUVUZELA') | |||
|
142 | ||||
|
143 | def send_digest(self, **kwargs): | |||
|
144 | """ | |||
|
145 | Build and send daily digest notification | |||
|
146 | ||||
|
147 | kwargs: | |||
|
148 | application: application that the event applies for, | |||
|
149 | user: user that should be notified | |||
|
150 | request: request object | |||
|
151 | since_when: reports are newer than this time value, | |||
|
152 | reports: list of reports to render | |||
|
153 | ||||
|
154 | """ | |||
|
155 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
156 | f_args = (template_vars['resource_name'], | |||
|
157 | template_vars['confirmed_total'],) | |||
|
158 | message = "Daily report digest: %s - %s reports" % f_args | |||
|
159 | message += '{}\n'.format(template_vars['destination_url']) | |||
|
160 | for room in self.integration.config['rooms'].split(','): | |||
|
161 | self.client.speak_to_room(room.strip(), message) | |||
|
162 | ||||
|
163 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( | |||
|
164 | kwargs['user'].user_name, | |||
|
165 | self.channel_visible_value, | |||
|
166 | template_vars['confirmed_total']) | |||
|
167 | log.warning(log_msg) | |||
|
168 | ||||
|
169 | def notify_chart_alert(self, **kwargs): | |||
|
170 | """ | |||
|
171 | Build and send chart alert notification | |||
|
172 | ||||
|
173 | Kwargs: | |||
|
174 | application: application that the event applies for, | |||
|
175 | event: event that is notified, | |||
|
176 | user: user that should be notified | |||
|
177 | request: request object | |||
|
178 | ||||
|
179 | """ | |||
|
180 | template_vars = self.chart_alert_notification_vars(kwargs) | |||
|
181 | message = 'ALERT {}: value in "{}" chart: ' \ | |||
|
182 | 'met alert "{}" criteria {} \n'.format( | |||
|
183 | template_vars['alert_action'], | |||
|
184 | template_vars['chart_name'], | |||
|
185 | template_vars['action_name'], | |||
|
186 | template_vars['destination_url'] | |||
|
187 | ) | |||
|
188 | ||||
|
189 | for item in template_vars['readable_values']: | |||
|
190 | message += '{}: {}\n'.format(item['label'], item['value']) | |||
|
191 | ||||
|
192 | for room in self.integration.config['rooms'].split(','): | |||
|
193 | self.client.speak_to_room(room.strip(), message, sound='VUVUZELA') |
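The channel above fans each message out to every room listed in `self.integration.config['rooms']`, splitting on commas and stripping whitespace before calling `speak_to_room`. A minimal sketch of that normalization step (`parse_rooms` is a hypothetical helper, not part of the diff):

```python
def parse_rooms(rooms_config):
    """Split a comma-separated rooms setting and drop surrounding
    whitespace, mirroring config['rooms'].split(',') plus strip()
    in the channel methods above. Empty entries are skipped."""
    return [room.strip() for room in rooms_config.split(',') if room.strip()]

print(parse_rooms('ops, alerts ,dev'))  # → ['ops', 'alerts', 'dev']
```

Skipping empty entries is a small hardening on top of the original, which would happily try to speak to a room named `''` if the config ends with a trailing comma.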
@@ -0,0 +1,180 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | from appenlight.models.alert_channel import AlertChannel | |||
|
24 | from appenlight.models.services.user import UserService | |||
|
25 | from webhelpers2.text import truncate | |||
|
26 | ||||
|
27 | log = logging.getLogger(__name__) | |||
|
28 | ||||
|
29 | ||||
|
30 | class EmailAlertChannel(AlertChannel): | |||
|
31 | """ | |||
|
32 | Default email alerting channel | |||
|
33 | """ | |||
|
34 | ||||
|
35 | __mapper_args__ = { | |||
|
36 | 'polymorphic_identity': 'email' | |||
|
37 | } | |||
|
38 | ||||
|
39 | def notify_reports(self, **kwargs): | |||
|
40 | """ | |||
|
41 | Notify user of individual reports | |||
|
42 | ||||
|
43 | kwargs: | |||
|
44 | application: application that the event applies for, | |||
|
45 | user: user that should be notified | |||
|
46 | request: request object | |||
|
47 | since_when: reports are newer than this time value, | |||
|
48 | reports: list of reports to render | |||
|
49 | ||||
|
50 | """ | |||
|
51 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
52 | ||||
|
53 | if template_vars['confirmed_total'] > 1: | |||
|
54 | template_vars["title"] = "App Enlight :: %s - %s reports" % ( | |||
|
55 | template_vars['resource_name'], | |||
|
56 | template_vars['confirmed_total'], | |||
|
57 | ) | |||
|
58 | else: | |||
|
59 | error_title = truncate(template_vars['reports'][0][1].error or | |||
|
60 | 'slow report', 20) | |||
|
61 | template_vars["title"] = "App Enlight :: %s - '%s' report" % ( | |||
|
62 | template_vars['resource_name'], | |||
|
63 | error_title) | |||
|
64 | UserService.send_email(kwargs['request'], | |||
|
65 | [self.channel_value], | |||
|
66 | template_vars, | |||
|
67 | '/email_templates/notify_reports.jinja2') | |||
|
68 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( | |||
|
69 | kwargs['user'].user_name, | |||
|
70 | self.channel_visible_value, | |||
|
71 | template_vars['confirmed_total']) | |||
|
72 | log.warning(log_msg) | |||
|
73 | ||||
|
74 | def send_digest(self, **kwargs): | |||
|
75 | """ | |||
|
76 | Build and send daily digest notification | |||
|
77 | ||||
|
78 | kwargs: | |||
|
79 | application: application that the event applies for, | |||
|
80 | user: user that should be notified | |||
|
81 | request: request object | |||
|
82 | since_when: reports are newer than this time value, | |||
|
83 | reports: list of reports to render | |||
|
84 | ||||
|
85 | """ | |||
|
86 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
87 | title = "App Enlight :: Daily report digest: %s - %s reports" | |||
|
88 | template_vars["email_title"] = title % ( | |||
|
89 | template_vars['resource_name'], | |||
|
90 | template_vars['confirmed_total'], | |||
|
91 | ) | |||
|
92 | ||||
|
93 | UserService.send_email(kwargs['request'], | |||
|
94 | [self.channel_value], | |||
|
95 | template_vars, | |||
|
96 | '/email_templates/notify_reports.jinja2', | |||
|
97 | immediately=True, | |||
|
98 | silent=True) | |||
|
99 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( | |||
|
100 | kwargs['user'].user_name, | |||
|
101 | self.channel_visible_value, | |||
|
102 | template_vars['confirmed_total']) | |||
|
103 | log.warning(log_msg) | |||
|
104 | ||||
|
105 | def notify_report_alert(self, **kwargs): | |||
|
106 | """ | |||
|
107 | Build and send report alert notification | |||
|
108 | ||||
|
109 | Kwargs: | |||
|
110 | application: application that the event applies for, | |||
|
111 | event: event that is notified, | |||
|
112 | user: user that should be notified | |||
|
113 | request: request object | |||
|
114 | ||||
|
115 | """ | |||
|
116 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
117 | ||||
|
118 | if kwargs['event'].unified_alert_action() == 'OPEN': | |||
|
119 | title = 'App Enlight :: ALERT %s: %s - %s %s' % ( | |||
|
120 | template_vars['alert_action'], | |||
|
121 | template_vars['resource_name'], | |||
|
122 | kwargs['event'].values['reports'], | |||
|
123 | template_vars['report_type'], | |||
|
124 | ) | |||
|
125 | else: | |||
|
126 | title = 'App Enlight :: ALERT %s: %s type: %s' % ( | |||
|
127 | template_vars['alert_action'], | |||
|
128 | template_vars['resource_name'], | |||
|
129 | template_vars['alert_type'].replace('_', ' '), | |||
|
130 | ) | |||
|
131 | template_vars['email_title'] = title | |||
|
132 | UserService.send_email(kwargs['request'], [self.channel_value], | |||
|
133 | template_vars, | |||
|
134 | '/email_templates/alert_reports.jinja2') | |||
|
135 | ||||
|
136 | def notify_uptime_alert(self, **kwargs): | |||
|
137 | """ | |||
|
138 | Build and send uptime alert notification | |||
|
139 | ||||
|
140 | Kwargs: | |||
|
141 | application: application that the event applies for, | |||
|
142 | event: event that is notified, | |||
|
143 | user: user that should be notified | |||
|
144 | request: request object | |||
|
145 | ||||
|
146 | """ | |||
|
147 | template_vars = self.uptime_alert_notification_vars(kwargs) | |||
|
148 | title = 'App Enlight :: ALERT %s: %s has uptime issues' % ( | |||
|
149 | template_vars['alert_action'], | |||
|
150 | template_vars['resource_name'], | |||
|
151 | ) | |||
|
152 | template_vars['email_title'] = title | |||
|
153 | ||||
|
154 | UserService.send_email(kwargs['request'], [self.channel_value], | |||
|
155 | template_vars, | |||
|
156 | '/email_templates/alert_uptime.jinja2') | |||
|
157 | ||||
|
158 | def notify_chart_alert(self, **kwargs): | |||
|
159 | """ | |||
|
160 | Build and send chart alert notification | |||
|
161 | ||||
|
162 | Kwargs: | |||
|
163 | application: application that the event applies for, | |||
|
164 | event: event that is notified, | |||
|
165 | user: user that should be notified | |||
|
166 | request: request object | |||
|
167 | ||||
|
168 | """ | |||
|
169 | template_vars = self.chart_alert_notification_vars(kwargs) | |||
|
170 | ||||
|
171 | title = 'App Enlight :: ALERT {}: value in "{}" chart' \ | |||
|
172 | ' met alert "{}" criteria'.format( | |||
|
173 | template_vars['alert_action'], | |||
|
174 | template_vars['chart_name'], | |||
|
175 | template_vars['action_name'], | |||
|
176 | ) | |||
|
177 | template_vars['email_title'] = title | |||
|
178 | UserService.send_email(kwargs['request'], [self.channel_value], | |||
|
179 | template_vars, | |||
|
180 | '/email_templates/alert_chart.jinja2') |
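`EmailAlertChannel.notify_reports` above picks between a count-based subject and an error-based one. A condensed sketch of that branching (`build_email_title` is a hypothetical helper; plain slicing stands in for `webhelpers2.text.truncate`, so the real output may differ slightly at the cut point):

```python
def build_email_title(resource_name, confirmed_total, first_error=None):
    """Mirror the subject-line branching in notify_reports above:
    several reports -> show the count, a single report -> show its
    (truncated) error string, falling back to 'slow report'."""
    if confirmed_total > 1:
        return "App Enlight :: %s - %s reports" % (resource_name,
                                                   confirmed_total)
    # crude stand-in for truncate(error or 'slow report', 20)
    error_title = (first_error or 'slow report')[:20]
    return "App Enlight :: %s - '%s' report" % (resource_name, error_title)

print(build_email_title('myapp', 3))
print(build_email_title('myapp', 1, 'IntegrityError: duplicate key'))
```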
@@ -0,0 +1,238 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | from appenlight.models.alert_channel import AlertChannel | |||
|
24 | from appenlight.models.integrations.flowdock import FlowdockIntegration | |||
|
25 | from webhelpers2.text import truncate | |||
|
26 | ||||
|
27 | log = logging.getLogger(__name__) | |||
|
28 | ||||
|
29 | ||||
|
30 | class FlowdockAlertChannel(AlertChannel): | |||
|
31 | __mapper_args__ = { | |||
|
32 | 'polymorphic_identity': 'flowdock' | |||
|
33 | } | |||
|
34 | ||||
|
35 | def notify_reports(self, **kwargs): | |||
|
36 | """ | |||
|
37 | Notify user of individual reports | |||
|
38 | ||||
|
39 | kwargs: | |||
|
40 | application: application that the event applies to, | |||
|
41 | user: user that should be notified | |||
|
42 | request: request object | |||
|
43 | since_when: reports are newer than this time value, | |||
|
44 | reports: list of reports to render | |||
|
45 | ||||
|
46 | """ | |||
|
47 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
48 | ||||
|
49 | app_url = kwargs['request'].registry.settings['_mail_url'] | |||
|
50 | destination_url = kwargs['request'].route_url('/', | |||
|
51 | _app_url=app_url) | |||
|
52 | f_args = ('report', | |||
|
53 | template_vars['resource'].resource_id, | |||
|
54 | template_vars['url_start_date'].strftime('%Y-%m-%dT%H:%M'), | |||
|
55 | template_vars['url_end_date'].strftime('%Y-%m-%dT%H:%M')) | |||
|
56 | destination_url += 'ui/{}?resource={}&start_date={}&end_date={}'.format( | |||
|
57 | *f_args) | |||
|
58 | ||||
|
59 | if template_vars['confirmed_total'] > 1: | |||
|
60 | template_vars["title"] = "%s - %s reports" % ( | |||
|
61 | template_vars['resource_name'], | |||
|
62 | template_vars['confirmed_total'], | |||
|
63 | ) | |||
|
64 | else: | |||
|
65 | error_title = truncate(template_vars['reports'][0][1].error or | |||
|
66 | 'slow report', 90) | |||
|
67 | template_vars["title"] = "%s - '%s' report" % ( | |||
|
68 | template_vars['resource_name'], | |||
|
69 | error_title) | |||
|
70 | ||||
|
71 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( | |||
|
72 | kwargs['user'].user_name, | |||
|
73 | self.channel_visible_value, | |||
|
74 | template_vars['confirmed_total']) | |||
|
75 | log.warning(log_msg) | |||
|
76 | ||||
|
77 | client = FlowdockIntegration.create_client( | |||
|
78 | self.integration.config['api_token']) | |||
|
79 | payload = { | |||
|
80 | "source": "App Enlight", | |||
|
81 | "from_address": kwargs['request'].registry.settings[ | |||
|
82 | 'mailing.from_email'], | |||
|
83 | "subject": template_vars["title"], | |||
|
84 | "content": "New report present", | |||
|
85 | "tags": ["appenlight"], | |||
|
86 | "link": destination_url | |||
|
87 | } | |||
|
88 | client.send_to_inbox(payload) | |||
|
89 | ||||
|
90 | def notify_report_alert(self, **kwargs): | |||
|
91 | """ | |||
|
92 | Build and send report alert notification | |||
|
93 | ||||
|
94 | Kwargs: | |||
|
95 | application: application that the event applies to, | |||
|
96 | event: event that is notified, | |||
|
97 | user: user that should be notified | |||
|
98 | request: request object | |||
|
99 | ||||
|
100 | """ | |||
|
101 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
102 | ||||
|
103 | if kwargs['event'].unified_alert_action() == 'OPEN': | |||
|
104 | ||||
|
105 | title = 'ALERT %s: %s - %s %s' % ( | |||
|
106 | template_vars['alert_action'], | |||
|
107 | template_vars['resource_name'], | |||
|
108 | kwargs['event'].values['reports'], | |||
|
109 | template_vars['report_type'], | |||
|
110 | ) | |||
|
111 | ||||
|
112 | else: | |||
|
113 | title = 'ALERT %s: %s type: %s' % ( | |||
|
114 | template_vars['alert_action'], | |||
|
115 | template_vars['resource_name'], | |||
|
116 | template_vars['alert_type'].replace('_', ' '), | |||
|
117 | ) | |||
|
118 | ||||
|
119 | client = FlowdockIntegration.create_client( | |||
|
120 | self.integration.config['api_token']) | |||
|
121 | payload = { | |||
|
122 | "source": "App Enlight", | |||
|
123 | "from_address": kwargs['request'].registry.settings[ | |||
|
124 | 'mailing.from_email'], | |||
|
125 | "subject": title, | |||
|
126 | "content": 'Investigation required', | |||
|
127 | "tags": ["appenlight", "alert", template_vars['alert_type']], | |||
|
128 | "link": template_vars['destination_url'] | |||
|
129 | } | |||
|
130 | client.send_to_inbox(payload) | |||
|
131 | ||||
|
132 | def notify_uptime_alert(self, **kwargs): | |||
|
133 | """ | |||
|
134 | Build and send uptime alert notification | |||
|
135 | ||||
|
136 | Kwargs: | |||
|
137 | application: application that the event applies to, | |||
|
138 | event: event that is notified, | |||
|
139 | user: user that should be notified | |||
|
140 | request: request object | |||
|
141 | ||||
|
142 | """ | |||
|
143 | template_vars = self.uptime_alert_notification_vars(kwargs) | |||
|
144 | ||||
|
145 | message = 'ALERT %s: %s has uptime issues' % ( | |||
|
146 | template_vars['alert_action'], | |||
|
147 | template_vars['resource_name'], | |||
|
148 | ) | |||
|
149 | submessage = 'Info: ' | |||
|
150 | submessage += template_vars['reason'] | |||
|
151 | ||||
|
152 | client = FlowdockIntegration.create_client( | |||
|
153 | self.integration.config['api_token']) | |||
|
154 | payload = { | |||
|
155 | "source": "App Enlight", | |||
|
156 | "from_address": kwargs['request'].registry.settings[ | |||
|
157 | 'mailing.from_email'], | |||
|
158 | "subject": message, | |||
|
159 | "content": submessage, | |||
|
160 | "tags": ["appenlight", "alert", 'uptime'], | |||
|
161 | "link": template_vars['destination_url'] | |||
|
162 | } | |||
|
163 | client.send_to_inbox(payload) | |||
|
164 | ||||
|
165 | def send_digest(self, **kwargs): | |||
|
166 | """ | |||
|
167 | Build and send daily digest notification | |||
|
168 | ||||
|
169 | kwargs: | |||
|
170 | application: application that the event applies to, | |||
|
171 | user: user that should be notified | |||
|
172 | request: request object | |||
|
173 | since_when: reports are newer than this time value, | |||
|
174 | reports: list of reports to render | |||
|
175 | ||||
|
176 | """ | |||
|
177 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
178 | message = "Daily report digest: %s - %s reports" % ( | |||
|
179 | template_vars['resource_name'], template_vars['confirmed_total']) | |||
|
180 | ||||
|
181 | f_args = (template_vars['confirmed_total'], | |||
|
182 | template_vars['timestamp']) | |||
|
183 | ||||
|
184 | payload = { | |||
|
185 | "source": "App Enlight", | |||
|
186 | "from_address": kwargs['request'].registry.settings[ | |||
|
187 | 'mailing.from_email'], | |||
|
188 | "subject": message, | |||
|
189 | "content": '%s reports in total since %s' % f_args, | |||
|
190 | "tags": ["appenlight", "digest"], | |||
|
191 | "link": template_vars['destination_url'] | |||
|
192 | } | |||
|
193 | ||||
|
194 | client = FlowdockIntegration.create_client( | |||
|
195 | self.integration.config['api_token']) | |||
|
196 | client.send_to_inbox(payload) | |||
|
197 | ||||
|
198 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( | |||
|
199 | kwargs['user'].user_name, | |||
|
200 | self.channel_visible_value, | |||
|
201 | template_vars['confirmed_total']) | |||
|
202 | log.warning(log_msg) | |||
|
203 | ||||
|
204 | def notify_chart_alert(self, **kwargs): | |||
|
205 | """ | |||
|
206 | Build and send chart alert notification | |||
|
207 | ||||
|
208 | Kwargs: | |||
|
209 | application: application that the event applies to, | |||
|
210 | event: event that is notified, | |||
|
211 | user: user that should be notified | |||
|
212 | request: request object | |||
|
213 | ||||
|
214 | """ | |||
|
215 | template_vars = self.chart_alert_notification_vars(kwargs) | |||
|
216 | ||||
|
217 | message = 'ALERT {}: value in "{}" chart ' \ | |||
|
218 | 'met alert "{}" criteria'.format( | |||
|
219 | template_vars['alert_action'], | |||
|
220 | template_vars['chart_name'], | |||
|
221 | template_vars['action_name'], | |||
|
222 | ) | |||
|
223 | submessage = 'Info: ' | |||
|
224 | for item in template_vars['readable_values']: | |||
|
225 | submessage += '{}: {}\n'.format(item['label'], item['value']) | |||
|
226 | ||||
|
227 | client = FlowdockIntegration.create_client( | |||
|
228 | self.integration.config['api_token']) | |||
|
229 | payload = { | |||
|
230 | "source": "App Enlight", | |||
|
231 | "from_address": kwargs['request'].registry.settings[ | |||
|
232 | 'mailing.from_email'], | |||
|
233 | "subject": message, | |||
|
234 | "content": submessage, | |||
|
235 | "tags": ["appenlight", "alert", 'chart'], | |||
|
236 | "link": template_vars['destination_url'] | |||
|
237 | } | |||
|
238 | client.send_to_inbox(payload) |
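Every Flowdock method above posts the same team-inbox payload shape through `client.send_to_inbox`, varying only subject, content, tags, and link. A sketch of that shape as a plain dict builder (`build_inbox_payload` is a hypothetical helper; the field names are taken from the calls above):

```python
def build_inbox_payload(subject, content, link, from_email, extra_tags=()):
    """Assemble the Flowdock team-inbox payload used by the
    FlowdockAlertChannel methods above; every message carries the
    base 'appenlight' tag plus any event-specific tags."""
    return {
        "source": "App Enlight",
        "from_address": from_email,
        "subject": subject,
        "content": content,
        "tags": ["appenlight"] + list(extra_tags),
        "link": link,
    }

payload = build_inbox_payload('ALERT OPEN: myapp has uptime issues',
                              'Investigation required',
                              'http://example.com/ui/uptime',
                              'alerts@example.com',
                              extra_tags=('alert', 'uptime'))
print(payload["tags"])  # → ['appenlight', 'alert', 'uptime']
```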
@@ -0,0 +1,234 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | from appenlight.models.alert_channel import AlertChannel | |||
|
24 | from appenlight.models.integrations.hipchat import HipchatIntegration | |||
|
25 | from webhelpers2.text import truncate | |||
|
26 | ||||
|
27 | log = logging.getLogger(__name__) | |||
|
28 | ||||
|
29 | ||||
|
30 | class HipchatAlertChannel(AlertChannel): | |||
|
31 | __mapper_args__ = { | |||
|
32 | 'polymorphic_identity': 'hipchat' | |||
|
33 | } | |||
|
34 | ||||
|
35 | def notify_reports(self, **kwargs): | |||
|
36 | """ | |||
|
37 | Notify user of individual reports | |||
|
38 | ||||
|
39 | kwargs: | |||
|
40 | application: application that the event applies for, | |||
|
41 | user: user that should be notified | |||
|
42 | request: request object | |||
|
43 | since_when: reports are newer than this time value, | |||
|
44 | reports: list of reports to render | |||
|
45 | ||||
|
46 | """ | |||
|
47 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
48 | ||||
|
49 | app_url = kwargs['request'].registry.settings['_mail_url'] | |||
|
50 | destination_url = kwargs['request'].route_url('/', | |||
|
51 | _app_url=app_url) | |||
|
52 | f_args = ('report', | |||
|
53 | template_vars['resource'].resource_id, | |||
|
54 | template_vars['url_start_date'].strftime('%Y-%m-%dT%H:%M'), | |||
|
55 | template_vars['url_end_date'].strftime('%Y-%m-%dT%H:%M')) | |||
|
56 | destination_url += 'ui/{}?resource={}&start_date={}&end_date={}'.format( | |||
|
57 | *f_args) | |||
|
58 | ||||
|
59 | if template_vars['confirmed_total'] > 1: | |||
|
60 | template_vars["title"] = "%s - %s reports" % ( | |||
|
61 | template_vars['resource_name'], | |||
|
62 | template_vars['confirmed_total'], | |||
|
63 | ) | |||
|
64 | else: | |||
|
65 | error_title = truncate(template_vars['reports'][0][1].error or | |||
|
66 | 'slow report', 90) | |||
|
67 | template_vars["title"] = "%s - '%s' report" % ( | |||
|
68 | template_vars['resource_name'], | |||
|
69 | error_title) | |||
|
70 | ||||
|
71 | template_vars["title"] += ' ' + destination_url | |||
|
72 | ||||
|
73 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( | |||
|
74 | kwargs['user'].user_name, | |||
|
75 | self.channel_visible_value, | |||
|
76 | template_vars['confirmed_total']) | |||
|
77 | log.warning(log_msg) | |||
|
78 | ||||
|
79 | client = HipchatIntegration.create_client( | |||
|
80 | self.integration.config['api_token']) | |||
|
81 | for room in self.integration.config['rooms'].split(','): | |||
|
82 | client.send({ | |||
|
83 | "message_format": 'text', | |||
|
84 | "message": template_vars["title"], | |||
|
85 | "from": "App Enlight", | |||
|
86 | "room_id": room.strip(), | |||
|
87 | "color": "yellow" | |||
|
88 | }) | |||
|
89 | ||||
|
90 | def notify_report_alert(self, **kwargs): | |||
|
91 | """ | |||
|
92 | Build and send report alert notification | |||
|
93 | ||||
|
94 | Kwargs: | |||
|
95 | application: application that the event applies for, | |||
|
96 | event: event that is notified, | |||
|
97 | user: user that should be notified | |||
|
98 | request: request object | |||
|
99 | ||||
|
100 | """ | |||
|
101 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
102 | ||||
|
103 | if kwargs['event'].unified_alert_action() == 'OPEN': | |||
|
104 | ||||
|
105 | title = 'ALERT %s: %s - %s %s' % ( | |||
|
106 | template_vars['alert_action'], | |||
|
107 | template_vars['resource_name'], | |||
|
108 | kwargs['event'].values['reports'], | |||
|
109 | template_vars['report_type'], | |||
|
110 | ) | |||
|
111 | ||||
|
112 | else: | |||
|
113 | title = 'ALERT %s: %s type: %s' % ( | |||
|
114 | template_vars['alert_action'], | |||
|
115 | template_vars['resource_name'], | |||
|
116 | template_vars['alert_type'].replace('_', ' '), | |||
|
117 | ) | |||
|
118 | ||||
|
119 | title += '\n ' + template_vars['destination_url'] | |||
|
120 | ||||
|
121 | api_token = self.integration.config['api_token'] | |||
|
122 | client = HipchatIntegration.create_client(api_token) | |||
|
123 | for room in self.integration.config['rooms'].split(','): | |||
|
124 | client.send({ | |||
|
125 | "message_format": 'text', | |||
|
126 | "message": title, | |||
|
127 | "from": "App Enlight", | |||
|
128 | "room_id": room.strip(), | |||
|
129 | "color": "red", | |||
|
130 | "notify": '1' | |||
|
131 | }) | |||
|
132 | ||||
|
133 | def notify_uptime_alert(self, **kwargs): | |||
|
134 | """ | |||
|
135 | Build and send uptime alert notification | |||
|
136 | ||||
|
137 | Kwargs: | |||
|
138 | application: application that the event applies for, | |||
|
139 | event: event that is notified, | |||
|
140 | user: user that should be notified | |||
|
141 | request: request object | |||
|
142 | ||||
|
143 | """ | |||
|
144 | template_vars = self.uptime_alert_notification_vars(kwargs) | |||
|
145 | ||||
|
146 | message = 'ALERT %s: %s has uptime issues\n' % ( | |||
|
147 | template_vars['alert_action'], | |||
|
148 | template_vars['resource_name'], | |||
|
149 | ) | |||
|
150 | message += template_vars['reason'] | |||
|
151 | message += '\n{}'.format(template_vars['destination_url']) | |||
|
152 | ||||
|
153 | api_token = self.integration.config['api_token'] | |||
|
154 | client = HipchatIntegration.create_client(api_token) | |||
|
155 | for room in self.integration.config['rooms'].split(','): | |||
|
156 | client.send({ | |||
|
157 | "message_format": 'text', | |||
|
158 | "message": message, | |||
|
159 | "from": "App Enlight", | |||
|
160 | "room_id": room.strip(), | |||
|
161 | "color": "red", | |||
|
162 | "notify": '1' | |||
|
163 | }) | |||
|
164 | ||||
|
165 | def notify_chart_alert(self, **kwargs): | |||
|
166 | """ | |||
|
167 | Build and send chart alert notification | |||
|
168 | ||||
|
169 | Kwargs: | |||
|
170 | application: application that the event applies for, | |||
|
171 | event: event that is notified, | |||
|
172 | user: user that should be notified | |||
|
173 | request: request object | |||
|
174 | ||||
|
175 | """ | |||
|
176 | template_vars = self.chart_alert_notification_vars(kwargs) | |||
|
177 | message = 'ALERT {}: value in "{}" chart: ' \ | |||
|
178 | 'met alert "{}" criteria\n'.format( | |||
|
179 | template_vars['alert_action'], | |||
|
180 | template_vars['chart_name'], | |||
|
181 | template_vars['action_name'], | |||
|
182 | ) | |||
|
183 | ||||
|
184 | for item in template_vars['readable_values']: | |||
|
185 | message += '{}: {}\n'.format(item['label'], item['value']) | |||
|
186 | ||||
|
187 | message += template_vars['destination_url'] | |||
|
188 | ||||
|
189 | api_token = self.integration.config['api_token'] | |||
|
190 | client = HipchatIntegration.create_client(api_token) | |||
|
191 | for room in self.integration.config['rooms'].split(','): | |||
|
192 | client.send({ | |||
|
193 | "message_format": 'text', | |||
|
194 | "message": message, | |||
|
195 | "from": "App Enlight", | |||
|
196 | "room_id": room.strip(), | |||
|
197 | "color": "red", | |||
|
198 | "notify": '1' | |||
|
199 | }) | |||
|
200 | ||||
|
201 | def send_digest(self, **kwargs): | |||
|
202 | """ | |||
|
203 | Build and send daily digest notification | |||
|
204 | ||||
|
205 | kwargs: | |||
|
206 | application: application that the event applies for, | |||
|
207 | user: user that should be notified | |||
|
208 | request: request object | |||
|
209 | since_when: reports are newer than this time value, | |||
|
210 | reports: list of reports to render | |||
|
211 | ||||
|
212 | """ | |||
|
213 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
214 | f_args = (template_vars['resource_name'], | |||
|
215 | template_vars['confirmed_total'],) | |||
|
216 | message = "Daily report digest: %s - %s reports" % f_args | |||
|
217 | message += '\n{}'.format(template_vars['destination_url']) | |||
|
218 | api_token = self.integration.config['api_token'] | |||
|
219 | client = HipchatIntegration.create_client(api_token) | |||
|
220 | for room in self.integration.config['rooms'].split(','): | |||
|
221 | client.send({ | |||
|
222 | "message_format": 'text', | |||
|
223 | "message": message, | |||
|
224 | "from": "App Enlight", | |||
|
225 | "room_id": room.strip(), | |||
|
226 | "color": "green", | |||
|
227 | "notify": '1' | |||
|
228 | }) | |||
|
229 | ||||
|
230 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( | |||
|
231 | kwargs['user'].user_name, | |||
|
232 | self.channel_visible_value, | |||
|
233 | template_vars['confirmed_total']) | |||
|
234 | log.warning(log_msg) |
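`HipchatAlertChannel` sends one API call per configured room, reusing the same payload with only the message, color, and notify flag varying (alerts are red with notify on, digests green, plain report notifications yellow). A sketch of that per-room payload (`hipchat_payload` is a hypothetical helper; the field names come from the `client.send` calls above):

```python
def hipchat_payload(message, room, color='red', notify=True):
    """Build the per-room message dict used by the client.send
    calls in HipchatAlertChannel above. The HipChat v1 API takes
    notify as the string '1' or '0', hence the conversion."""
    return {
        "message_format": 'text',
        "message": message,
        "from": "App Enlight",
        "room_id": room.strip(),
        "color": color,
        "notify": '1' if notify else '0',
    }

print(hipchat_payload('Daily digest ready', ' ops ', color='green')["room_id"])  # → ops
```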
@@ -0,0 +1,290 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | from appenlight.models.alert_channel import AlertChannel | |||
|
24 | from appenlight.models.integrations.slack import SlackIntegration | |||
|
25 | from webhelpers2.text import truncate | |||
|
26 | ||||
|
27 | log = logging.getLogger(__name__) | |||
|
28 | ||||
|
29 | ||||
|
30 | class SlackAlertChannel(AlertChannel): | |||
|
31 | __mapper_args__ = { | |||
|
32 | 'polymorphic_identity': 'slack' | |||
|
33 | } | |||
|
34 | ||||
|
35 | def notify_reports(self, **kwargs): | |||
|
36 | """ | |||
|
37 | Notify user of individual reports | |||
|
38 | ||||
|
39 | kwargs: | |||
|
40 | application: application that the event applies to, | |||
|
41 | user: user that should be notified | |||
|
42 | request: request object | |||
|
43 | since_when: reports are newer than this time value, | |||
|
44 | reports: list of reports to render | |||
|
45 | ||||
|
46 | """ | |||
|
47 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
48 | template_vars["title"] = template_vars['resource_name'] | |||
|
49 | ||||
|
50 | if template_vars['confirmed_total'] > 1: | |||
|
51 | template_vars['subtext'] = '%s reports' % template_vars[ | |||
|
52 | 'confirmed_total'] | |||
|
53 | else: | |||
|
54 | error_title = truncate(template_vars['reports'][0][1].error or | |||
|
55 | 'slow report', 90) | |||
|
56 | template_vars['subtext'] = error_title | |||
|
57 | ||||
|
58 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( | |||
|
59 | kwargs['user'].user_name, | |||
|
60 | self.channel_visible_value, | |||
|
61 | template_vars['confirmed_total']) | |||
|
62 | log.warning(log_msg) | |||
|
63 | ||||
|
64 | client = SlackIntegration.create_client( | |||
|
65 | self.integration.config['webhook_url']) | |||
|
66 | report_data = { | |||
|
67 | "username": "App Enlight", | |||
|
68 | "icon_emoji": ":fire:", | |||
|
69 | "attachments": [ | |||
|
70 | { | |||
|
71 | "mrkdwn_in": ["text", "pretext", "title", "fallback"], | |||
|
72 | "fallback": "*%s* - <%s| Browse>" % ( | |||
|
73 | template_vars["title"], | |||
|
74 | template_vars['destination_url']), | |||
|
75 | "pretext": "*%s* - <%s| Browse>" % ( | |||
|
76 | template_vars["title"], | |||
|
77 | template_vars['destination_url']), | |||
|
78 | "color": "warning", | |||
|
79 | "fields": [ | |||
|
80 | { | |||
|
81 | "value": 'Info: %s' % template_vars['subtext'], | |||
|
82 | "short": False | |||
|
83 | } | |||
|
84 | ] | |||
|
85 | } | |||
|
86 | ] | |||
|
87 | } | |||
|
88 | client.make_request(data=report_data) | |||
|
89 | ||||
|
90 | def notify_report_alert(self, **kwargs): | |||
|
91 | """ | |||
|
92 | Build and send report alert notification | |||
|
93 | ||||
|
94 | Kwargs: | |||
|
95 | application: application that the event applies to, | |||
|
96 | event: event that is notified, | |||
|
97 | user: user that should be notified | |||
|
98 | request: request object | |||
|
99 | ||||
|
100 | """ | |||
|
101 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
102 | ||||
|
103 | if kwargs['event'].unified_alert_action() == 'OPEN': | |||
|
104 | title = '*ALERT %s*: %s' % ( | |||
|
105 | template_vars['alert_action'], | |||
|
106 | template_vars['resource_name'] | |||
|
107 | ) | |||
|
108 | ||||
|
109 | template_vars['subtext'] = 'Got at least %s %s' % ( | |||
|
110 | kwargs['event'].values['reports'], | |||
|
111 | template_vars['report_type'] | |||
|
112 | ) | |||
|
113 | ||||
|
114 | else: | |||
|
115 | title = '*ALERT %s*: %s' % ( | |||
|
116 | template_vars['alert_action'], | |||
|
117 | template_vars['resource_name'], | |||
|
118 | ) | |||
|
119 | ||||
|
120 | template_vars['subtext'] = '' | |||
|
121 | ||||
|
122 | alert_type = template_vars['alert_type'].replace('_', ' ') | |||
|
123 | alert_type = alert_type.replace('alert', '').capitalize() | |||
|
124 | ||||
|
125 | template_vars['type'] = "Type: %s" % alert_type | |||
|
126 | ||||
|
127 | client = SlackIntegration.create_client( | |||
|
128 | self.integration.config['webhook_url'] | |||
|
129 | ) | |||
|
130 | report_data = { | |||
|
131 | "username": "App Enlight", | |||
|
132 | "icon_emoji": ":rage:", | |||
|
133 | "attachments": [ | |||
|
134 | { | |||
|
135 | "mrkdwn_in": ["text", "pretext", "title", "fallback"], | |||
|
136 | "fallback": "%s - <%s| Browse>" % ( | |||
|
137 | title, template_vars['destination_url']), | |||
|
138 | "pretext": "%s - <%s| Browse>" % ( | |||
|
139 | title, template_vars['destination_url']), | |||
|
140 | "color": "danger", | |||
|
141 | "fields": [ | |||
|
142 | { | |||
|
143 | "title": template_vars['type'], | |||
|
144 | "value": template_vars['subtext'], | |||
|
145 | "short": False | |||
|
146 | } | |||
|
147 | ] | |||
|
148 | } | |||
|
149 | ] | |||
|
150 | } | |||
|
151 | client.make_request(data=report_data) | |||
|
152 | ||||
|
153 | def notify_uptime_alert(self, **kwargs): | |||
|
154 | """ | |||
|
155 | Build and send uptime alert notification | |||
|
156 | ||||
|
157 | Kwargs: | |||
|
158 | application: application that the event applies for, | |||
|
159 | event: event that is notified, | |||
|
160 | user: user that should be notified | |||
|
161 | request: request object | |||
|
162 | ||||
|
163 | """ | |||
|
164 | template_vars = self.uptime_alert_notification_vars(kwargs) | |||
|
165 | ||||
|
166 | title = '*ALERT %s*: %s' % ( | |||
|
167 | template_vars['alert_action'], | |||
|
168 | template_vars['resource_name'], | |||
|
169 | ) | |||
|
170 | client = SlackIntegration.create_client( | |||
|
171 | self.integration.config['webhook_url'] | |||
|
172 | ) | |||
|
173 | report_data = { | |||
|
174 | "username": "App Enlight", | |||
|
175 | "icon_emoji": ":rage:", | |||
|
176 | "attachments": [ | |||
|
177 | { | |||
|
178 | "mrkdwn_in": ["text", "pretext", "title", "fallback"], | |||
|
179 | "fallback": "{} - <{}| Browse>".format( | |||
|
180 | title, template_vars['destination_url']), | |||
|
181 | "pretext": "{} - <{}| Browse>".format( | |||
|
182 | title, template_vars['destination_url']), | |||
|
183 | "color": "danger", | |||
|
184 | "fields": [ | |||
|
185 | { | |||
|
186 | "title": "Application has uptime issues", | |||
|
187 | "value": template_vars['reason'], | |||
|
188 | "short": False | |||
|
189 | } | |||
|
190 | ] | |||
|
191 | } | |||
|
192 | ] | |||
|
193 | } | |||
|
194 | client.make_request(data=report_data) | |||
|
195 | ||||
|
196 | def notify_chart_alert(self, **kwargs): | |||
|
197 | """ | |||
|
198 | Build and send chart alert notification | |||
|
199 | ||||
|
200 | Kwargs: | |||
|
201 | application: application that the event applies for, | |||
|
202 | event: event that is notified, | |||
|
203 | user: user that should be notified | |||
|
204 | request: request object | |||
|
205 | ||||
|
206 | """ | |||
|
207 | template_vars = self.chart_alert_notification_vars(kwargs) | |||
|
208 | ||||
|
209 | title = '*ALERT {}*: value in *"{}"* chart ' \ | |||
|
210 | 'met alert *"{}"* criteria'.format( | |||
|
211 | template_vars['alert_action'], | |||
|
212 | template_vars['chart_name'], | |||
|
213 | template_vars['action_name'], | |||
|
214 | ) | |||
|
215 | ||||
|
216 | subtext = '' | |||
|
217 | for item in template_vars['readable_values']: | |||
|
218 | subtext += '{} - {}\n'.format(item['label'], item['value']) | |||
|
219 | ||||
|
220 | client = SlackIntegration.create_client( | |||
|
221 | self.integration.config['webhook_url'] | |||
|
222 | ) | |||
|
223 | report_data = { | |||
|
224 | "username": "App Enlight", | |||
|
225 | "icon_emoji": ":rage:", | |||
|
226 | "attachments": [ | |||
|
227 | {"mrkdwn_in": ["text", "pretext", "title", "fallback"], | |||
|
228 | "fallback": "{} - <{}| Browse>".format( | |||
|
229 | title, template_vars['destination_url']), | |||
|
230 | "pretext": "{} - <{}| Browse>".format( | |||
|
231 | title, template_vars['destination_url']), | |||
|
232 | "color": "danger", | |||
|
233 | "fields": [ | |||
|
234 | { | |||
|
235 | "title": "Following criteria were met:", | |||
|
236 | "value": subtext, | |||
|
237 | "short": False | |||
|
238 | } | |||
|
239 | ] | |||
|
240 | } | |||
|
241 | ] | |||
|
242 | } | |||
|
243 | client.make_request(data=report_data) | |||
|
244 | ||||
|
245 | def send_digest(self, **kwargs): | |||
|
246 | """ | |||
|
247 | Build and send daily digest notification | |||
|
248 | ||||
|
249 | kwargs: | |||
|
250 | application: application that the event applies for, | |||
|
251 | user: user that should be notified | |||
|
252 | request: request object | |||
|
253 | since_when: reports are newer than this time value, | |||
|
254 | reports: list of reports to render | |||
|
255 | ||||
|
256 | """ | |||
|
257 | template_vars = self.report_alert_notification_vars(kwargs) | |||
|
258 | title = "*Daily report digest*: %s" % template_vars['resource_name'] | |||
|
259 | ||||
|
260 | subtext = '%s reports' % template_vars['confirmed_total'] | |||
|
261 | ||||
|
262 | client = SlackIntegration.create_client( | |||
|
263 | self.integration.config['webhook_url'] | |||
|
264 | ) | |||
|
265 | report_data = { | |||
|
266 | "username": "App Enlight", | |||
|
267 | "attachments": [ | |||
|
268 | { | |||
|
269 | "mrkdwn_in": ["text", "pretext", "title", "fallback"], | |||
|
270 | "fallback": "%s : <%s| Browse>" % ( | |||
|
271 | title, template_vars['destination_url']), | |||
|
272 | "pretext": "%s: <%s| Browse>" % ( | |||
|
273 | title, template_vars['destination_url']), | |||
|
274 | "color": "good", | |||
|
275 | "fields": [ | |||
|
276 | { | |||
|
277 | "title": "Got at least: %s" % subtext, | |||
|
278 | "short": False | |||
|
279 | } | |||
|
280 | ] | |||
|
281 | } | |||
|
282 | ] | |||
|
283 | } | |||
|
284 | client.make_request(data=report_data) | |||
|
285 | ||||
|
286 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( | |||
|
287 | kwargs['user'].user_name, | |||
|
288 | self.channel_visible_value, | |||
|
289 | template_vars['confirmed_total']) | |||
|
290 | log.warning(log_msg) |
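All of the notifier methods above build the same legacy Slack incoming-webhook payload; only the emoji, color, and attachment fields vary. A minimal sketch of that shared shape (the helper name `build_slack_payload` is ours, not from the source):

```python
def build_slack_payload(title, destination_url, fields, color="danger",
                        icon_emoji=":rage:"):
    """Build a Slack legacy-attachments payload like the notifiers above."""
    link = "{} - <{}| Browse>".format(title, destination_url)
    return {
        "username": "App Enlight",
        "icon_emoji": icon_emoji,
        "attachments": [
            {
                # let Slack render markdown in these attachment parts
                "mrkdwn_in": ["text", "pretext", "title", "fallback"],
                "fallback": link,   # plain-text fallback for old clients
                "pretext": link,    # text shown above the attachment
                "color": color,     # colored bar: "danger"/"warning"/"good"
                "fields": fields,
            }
        ],
    }

payload = build_slack_payload(
    "*ALERT OPEN*: my-app", "https://example.com/report/1",
    [{"title": "Type: Error", "value": "Got at least 10 error reports",
      "short": False}])
```

The resulting dict is what gets handed to `client.make_request(data=...)` in each method.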
@@ -0,0 +1,109 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import uuid
import logging
import sqlalchemy as sa
from appenlight.models.resource import Resource
from sqlalchemy.orm import aliased

log = logging.getLogger(__name__)


def generate_api_key():
    uid = str(uuid.uuid4()).replace('-', '')
    return uid[0:32]


class Application(Resource):
    """
    Resource of application type
    """

    __tablename__ = 'applications'
    __mapper_args__ = {'polymorphic_identity': 'application'}

    # lists configurable possible permissions for this resource type
    __possible_permissions__ = ('view', 'update_reports')

    resource_id = sa.Column(sa.Integer(),
                            sa.ForeignKey('resources.resource_id',
                                          onupdate='CASCADE',
                                          ondelete='CASCADE', ),
                            primary_key=True, )
    domains = sa.Column(sa.UnicodeText(), nullable=False, default='')
    api_key = sa.Column(sa.String(32), nullable=False, unique=True, index=True,
                        default=generate_api_key)
    public_key = sa.Column(sa.String(32), nullable=False, unique=True,
                           index=True,
                           default=generate_api_key)
    default_grouping = sa.Column(sa.Unicode(20), nullable=False,
                                 default='url_traceback')
    error_report_threshold = sa.Column(sa.Integer(), default=10)
    slow_report_threshold = sa.Column(sa.Integer(), default=10)
    allow_permanent_storage = sa.Column(sa.Boolean(), default=False,
                                        nullable=False)

    @sa.orm.validates('default_grouping')
    def validate_default_grouping(self, key, grouping):
        """ validate that the resource uses a supported grouping scheme """
        assert grouping in ['url_type', 'url_traceback', 'traceback_server']
        return grouping

    report_groups = sa.orm.relationship('ReportGroup',
                                        cascade="all, delete-orphan",
                                        passive_deletes=True,
                                        passive_updates=True,
                                        lazy='dynamic',
                                        backref=sa.orm.backref('application',
                                                               lazy="joined"))

    postprocess_conf = sa.orm.relationship('ApplicationPostprocessConf',
                                           cascade="all, delete-orphan",
                                           passive_deletes=True,
                                           passive_updates=True,
                                           backref='resource')

    logs = sa.orm.relationship('Log',
                               lazy='dynamic',
                               backref='application',
                               passive_deletes=True,
                               passive_updates=True, )

    integrations = sa.orm.relationship('IntegrationBase',
                                       backref='resource',
                                       cascade="all, delete-orphan",
                                       passive_deletes=True,
                                       passive_updates=True, )

    def generate_api_key(self):
        return generate_api_key()


def after_update(mapper, connection, target):
    from appenlight.models.services.application import ApplicationService
    log.info('clearing out ApplicationService cache')
    ApplicationService.by_id_cached().invalidate(target.resource_id)
    ApplicationService.by_api_key_cached().invalidate(target.api_key)


sa.event.listen(Application, 'after_update', after_update)
sa.event.listen(Application, 'after_delete', after_update)
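The `generate_api_key` module function above is used as the column default for both `api_key` and `public_key`. Its output is a UUID4 rendered as hex, which is exactly 32 characters once the dashes are removed, so it fits the `sa.String(32)` columns. A standalone copy for illustration:

```python
import uuid

def generate_api_key():
    # Same approach as the Application model default: a random UUID4
    # with dashes stripped. That leaves 32 hex characters, so the
    # [0:32] slice is effectively the whole string.
    uid = str(uuid.uuid4()).replace('-', '')
    return uid[0:32]

key = generate_api_key()
```

Because the columns are `unique=True`, a collision would surface as an integrity error on insert, but with 122 bits of randomness that is negligible in practice.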
@@ -0,0 +1,50 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

from ziggurat_foundations.models.base import BaseModel
import sqlalchemy as sa

from appenlight.models import Base
from appenlight.models.report_group import ReportGroup


class ApplicationPostprocessConf(Base, BaseModel):
    """
    Stores prioritizing conditions for reports.
    This is later used for rule parsing like
    "if 10 occurrences, bump priority +1"
    """

    __tablename__ = 'application_postprocess_conf'

    pkey = sa.Column(sa.Integer(), nullable=False, primary_key=True)
    resource_id = sa.Column(sa.Integer(),
                            sa.ForeignKey('resources.resource_id',
                                          onupdate='CASCADE',
                                          ondelete='CASCADE'))
    do = sa.Column(sa.Unicode(25), nullable=False)
    new_value = sa.Column(sa.UnicodeText(), nullable=False, default='')
    rule = sa.Column(sa.dialects.postgresql.JSON,
                     nullable=False, default={'field': 'http_status',
                                              "op": "ge", "value": "500"})

    def postprocess(self, item):
        new_value = int(self.new_value)
        item.priority = ReportGroup.priority + new_value
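The `rule` JSON column above stores a condition such as `{'field': 'http_status', 'op': 'ge', 'value': '500'}`; when it matches a report, `postprocess` bumps the report group's priority. The actual rule evaluation lives elsewhere in AppEnlight's rule engine, but an illustrative matcher for this one-condition shape might look like:

```python
import operator

# Hypothetical sketch only: maps the 'op' strings a rule may carry to
# comparison functions, then applies the rule to a report dict.
OPS = {'ge': operator.ge, 'le': operator.le, 'eq': operator.eq}

def rule_matches(rule, report):
    """Return True when `report[field] <op> int(value)` holds."""
    compare = OPS[rule['op']]
    return compare(report.get(rule['field']), int(rule['value']))

matched = rule_matches(
    {'field': 'http_status', 'op': 'ge', 'value': '500'},
    {'http_status': 502})
```

Note that `value` is stored as a string in the default rule, which is why both this sketch and `postprocess` (via `int(self.new_value)`) cast to `int` before using it.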
@@ -0,0 +1,57 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import logging
import sqlalchemy as sa

from datetime import datetime
from appenlight.models import Base
from ziggurat_foundations.models.base import BaseModel
from ziggurat_foundations.models.services.user import UserService

log = logging.getLogger(__name__)


class AuthToken(Base, BaseModel):
    """
    Stores per-user API authentication tokens
    """
    __tablename__ = 'auth_tokens'

    id = sa.Column(sa.Integer, primary_key=True, nullable=False)
    token = sa.Column(sa.Unicode(40), nullable=False,
                      default=lambda x: UserService.generate_random_string(40))
    owner_id = sa.Column(sa.Unicode(30),
                         sa.ForeignKey('users.id', onupdate='CASCADE',
                                       ondelete='CASCADE'))
    creation_date = sa.Column(sa.DateTime, default=lambda x: datetime.utcnow())
    expires = sa.Column(sa.DateTime)
    description = sa.Column(sa.Unicode, default='')

    @property
    def is_expired(self):
        if self.expires:
            return self.expires < datetime.utcnow()
        else:
            return False

    def __str__(self):
        return '<AuthToken u:%s t:%s...>' % (self.owner_id, self.token[0:10])
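The `is_expired` property above treats a `NULL` `expires` column as "never expires" and otherwise compares against the current UTC time. A standalone version of that logic (the function name is ours, for illustration):

```python
from datetime import datetime, timedelta

def is_expired(expires, now=None):
    """Mirror of AuthToken.is_expired: no expiry date means the token
    never expires; otherwise it is expired once `expires` is in the past."""
    now = now or datetime.utcnow()
    if expires:
        return expires < now
    return False

stale = is_expired(datetime.utcnow() - timedelta(days=1))
fresh = is_expired(datetime.utcnow() + timedelta(days=1))
```

Taking `now` as an optional parameter makes the check testable without patching the clock, while defaulting to `datetime.utcnow()` keeps the original behavior.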
@@ -0,0 +1,37 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import sqlalchemy as sa
from ziggurat_foundations.models.base import BaseModel
from sqlalchemy.dialects.postgresql import JSON

from . import Base


class Config(Base, BaseModel):
    __tablename__ = 'config'

    key = sa.Column(sa.Unicode, primary_key=True)
    section = sa.Column(sa.Unicode, primary_key=True)
    value = sa.Column(JSON, nullable=False)

    def __json__(self, request):
        return self.get_dict()
@@ -0,0 +1,160 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import sqlalchemy as sa
import logging

from datetime import datetime
from appenlight.models import Base, get_db_session
from appenlight.models.services.report_stat import ReportStatService
from appenlight.models.resource import Resource
from appenlight.models.integrations import IntegrationException
from pyramid.threadlocal import get_current_request
from sqlalchemy.dialects.postgresql import JSON
from ziggurat_foundations.models.base import BaseModel

log = logging.getLogger(__name__)


class Event(Base, BaseModel):
    __tablename__ = 'events'

    types = {'error_report_alert': 1,
             'slow_report_alert': 3,
             'comment': 5,
             'assignment': 6,
             'uptime_alert': 7,
             'chart_alert': 9}

    statuses = {'active': 1,
                'closed': 0}

    id = sa.Column(sa.Integer, primary_key=True)
    start_date = sa.Column(sa.DateTime, default=datetime.utcnow)
    end_date = sa.Column(sa.DateTime)
    status = sa.Column(sa.Integer, default=1)
    event_type = sa.Column(sa.Integer, default=1)
    origin_user_id = sa.Column(sa.Integer(), sa.ForeignKey('users.id'),
                               nullable=True)
    target_user_id = sa.Column(sa.Integer(), sa.ForeignKey('users.id'),
                               nullable=True)
    resource_id = sa.Column(sa.Integer(),
                            sa.ForeignKey('resources.resource_id'),
                            nullable=True)
    target_id = sa.Column(sa.Integer)
    target_uuid = sa.Column(sa.Unicode(40))
    text = sa.Column(sa.UnicodeText())
    values = sa.Column(JSON(), nullable=False, default=None)

    def __repr__(self):
        return '<Event %s, app:%s, %s>' % (self.unified_alert_name(),
                                           self.resource_id,
                                           self.unified_alert_action())

    @property
    def reverse_types(self):
        return dict([(v, k) for k, v in self.types.items()])

    def unified_alert_name(self):
        return self.reverse_types[self.event_type]

    def unified_alert_action(self):
        if self.status == Event.statuses['closed']:
            return "CLOSE"
        return "OPEN"

    def send_alerts(self, request=None, resource=None, db_session=None):
        """ Sends alerts to applicable channels """
        db_session = get_db_session(db_session)
        db_session.flush()
        if not resource:
            resource = Resource.by_resource_id(self.resource_id)
        if not request:
            request = get_current_request()
        if not resource:
            return
        users = set([p.user for p in resource.users_for_perm('view')])
        for user in users:
            for channel in user.alert_channels:
                if not channel.channel_validated or not channel.send_alerts:
                    continue
                try:
                    channel.notify_alert(resource=resource,
                                         event=self,
                                         user=user,
                                         request=request)
                except IntegrationException as e:
                    log.warning('%s' % e)

    def validate_or_close(self, since_when, db_session=None):
        """ Checks if alerts should stay open or it's time to close them.
        Generates a close alert event if alerts get closed """
        event_types = [Event.types['error_report_alert'],
                       Event.types['slow_report_alert']]
        app = Resource.by_resource_id(self.resource_id)
        if self.event_type in event_types:
            total = ReportStatService.count_by_type(
                self.event_type, self.resource_id, since_when)
            if Event.types['error_report_alert'] == self.event_type:
                threshold = app.error_report_threshold
            if Event.types['slow_report_alert'] == self.event_type:
                threshold = app.slow_report_threshold

            if total < threshold:
                self.close()

    def close(self, db_session=None):
        """
        Closes an event and sends notification to affected users
        """
        self.end_date = datetime.utcnow()
        self.status = Event.statuses['closed']
        log.warning('ALERT: CLOSE: %s' % self)
        self.send_alerts()

    def text_representation(self):
        alert_type = self.unified_alert_name()
        text = ''
        if 'slow_report' in alert_type:
            text += 'Slow report alert'
        if 'error_report' in alert_type:
            text += 'Exception report alert'
        if 'uptime_alert' in alert_type:
            text += 'Uptime alert'
        if 'chart_alert' in alert_type:
            text += 'Metrics value alert'

        alert_action = self.unified_alert_action()
        if alert_action == 'OPEN':
            text += ' got opened.'
        if alert_action == 'CLOSE':
            text += ' got closed.'
        return text

    def get_dict(self, request=None):
        dict_data = super(Event, self).get_dict()
        dict_data['text'] = self.text_representation()
        dict_data['resource_name'] = self.resource.resource_name
        return dict_data
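The `Event.types` mapping stores event kinds as small integers in the `event_type` column, and the `reverse_types` property inverts it so the integer can be rendered back as a name (as `unified_alert_name` and `text_representation` do). A standalone illustration of that round trip:

```python
# Same mapping as the Event model above.
types = {'error_report_alert': 1,
         'slow_report_alert': 3,
         'comment': 5,
         'assignment': 6,
         'uptime_alert': 7,
         'chart_alert': 9}

# Invert it: integer column value -> human-readable event name.
reverse_types = {v: k for k, v in types.items()}

name = reverse_types[7]
```

Because the integer codes are unique, the inverted dict is the same size as the original and the lookup is unambiguous.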
@@ -0,0 +1,41 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import sqlalchemy as sa
from sqlalchemy.ext.declarative import declared_attr
from ziggurat_foundations.models.external_identity import ExternalIdentityMixin

from appenlight.models import Base
from appenlight.lib.sqlalchemy_fields import EncryptedUnicode


class ExternalIdentity(ExternalIdentityMixin, Base):
    @declared_attr
    def access_token(self):
        return sa.Column(EncryptedUnicode(255), default='')

    @declared_attr
    def alt_token(self):
        return sa.Column(EncryptedUnicode(255), default='')

    @declared_attr
    def token_secret(self):
        return sa.Column(EncryptedUnicode(255), default='')
@@ -0,0 +1,50 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

from ziggurat_foundations.models.group import GroupMixin
from appenlight.models import Base


class Group(GroupMixin, Base):
    __possible_permissions__ = ('root_administration',
                                'test_features',
                                'admin_panel',
                                'admin_users',
                                'manage_partitions',)

    def get_dict(self, exclude_keys=None, include_keys=None,
                 include_perms=False):
        result = super(Group, self).get_dict(exclude_keys, include_keys)
        if include_perms:
            result['possible_permissions'] = self.__possible_permissions__
            result['current_permissions'] = [p.perm_name for p in
                                             self.permissions]
        else:
            result['possible_permissions'] = []
            result['current_permissions'] = []
        exclude_keys_list = exclude_keys or []
        include_keys_list = include_keys or []
        d = {}
        for k in result.keys():
            if (k not in exclude_keys_list and
                    (k in include_keys_list or not include_keys)):
                d[k] = result[k]
        return d
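The tail of `Group.get_dict` filters the serialized dict: excluded keys are always dropped, and if `include_keys` is given, only those keys survive. A standalone sketch of that filter (the helper name `filter_keys` is ours):

```python
def filter_keys(result, exclude_keys=None, include_keys=None):
    """Same filtering as the end of Group.get_dict above: drop keys in
    exclude_keys; if include_keys is non-empty, keep only those keys."""
    exclude_keys_list = exclude_keys or []
    include_keys_list = include_keys or []
    return {k: v for k, v in result.items()
            if k not in exclude_keys_list and
            (k in include_keys_list or not include_keys)}

d = filter_keys({'id': 1, 'group_name': 'admins', 'secret': 'x'},
                exclude_keys=['secret'])
```

Note the `or not include_keys` clause: an empty or `None` include list means "keep everything not excluded", so the two parameters compose rather than conflict.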
@@ -0,0 +1,27 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

from ziggurat_foundations.models.group_permission import GroupPermissionMixin
from appenlight.models import Base


class GroupPermission(GroupPermissionMixin, Base):
    pass
@@ -0,0 +1,28 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016  RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

from ziggurat_foundations.models.group_resource_permission import \
    GroupResourcePermissionMixin
from appenlight.models import Base


class GroupResourcePermission(GroupResourcePermissionMixin, Base):
    pass
@@ -0,0 +1,83 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | from sqlalchemy.dialects.postgresql import JSON | |||
|
24 | from sqlalchemy.ext.hybrid import hybrid_property | |||
|
25 | from ziggurat_foundations.models.base import BaseModel | |||
|
26 | ||||
|
27 | from appenlight.lib.encryption import decrypt_dictionary_keys | |||
|
28 | from appenlight.lib.encryption import encrypt_dictionary_keys | |||
|
29 | from appenlight.models import Base, get_db_session | |||
|
30 | ||||
|
31 | ||||
|
32 | class IntegrationException(Exception): | |||
|
33 | pass | |||
|
34 | ||||
|
35 | ||||
|
36 | class IntegrationBase(Base, BaseModel): | |||
|
37 | """ | |||
|
38 | Model from which all integrations inherit using polymorphic approach | |||
|
39 | """ | |||
|
40 | __tablename__ = 'integrations' | |||
|
41 | ||||
|
42 | front_visible = False | |||
|
43 | as_alert_channel = False | |||
|
44 | supports_report_alerting = False | |||
|
45 | ||||
|
46 | id = sa.Column(sa.Integer, primary_key=True) | |||
|
47 | resource_id = sa.Column(sa.Integer, | |||
|
48 | sa.ForeignKey('applications.resource_id')) | |||
|
49 | integration_name = sa.Column(sa.Unicode(64)) | |||
|
50 | _config = sa.Column('config', JSON(), nullable=False, default='') | |||
|
51 | modified_date = sa.Column(sa.DateTime) | |||
|
52 | ||||
|
53 | channel = sa.orm.relationship('AlertChannel', | |||
|
54 | cascade="all,delete-orphan", | |||
|
55 | passive_deletes=True, | |||
|
56 | passive_updates=True, | |||
|
57 | uselist=False, | |||
|
58 | backref='integration') | |||
|
59 | ||||
|
60 | __mapper_args__ = { | |||
|
61 | 'polymorphic_on': 'integration_name', | |||
|
62 | 'polymorphic_identity': 'integration' | |||
|
63 | } | |||
|
64 | ||||
|
65 | @classmethod | |||
|
66 | def by_app_id_and_integration_name(cls, resource_id, integration_name, | |||
|
67 | db_session=None): | |||
|
68 | db_session = get_db_session(db_session) | |||
|
69 | query = db_session.query(cls) | |||
|
70 | query = query.filter(cls.integration_name == integration_name) | |||
|
71 | query = query.filter(cls.resource_id == resource_id) | |||
|
72 | return query.first() | |||
|
73 | ||||
|
74 | @hybrid_property | |||
|
75 | def config(self): | |||
|
76 | return decrypt_dictionary_keys(self._config) | |||
|
77 | ||||
|
78 | @config.setter | |||
|
79 | def config(self, value): | |||
|
80 | if not hasattr(value, 'items'): | |||
|
81 | raise Exception('IntegrationBase.config only accepts ' | |||
|
82 | 'flat dictionaries') | |||
|
83 | self._config = encrypt_dictionary_keys(value) |
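The `config` hybrid property above stores only the encrypted form of the dictionary and decrypts it on every read. A minimal sketch of the same get/set pattern, using a toy ROT13 transform as a stand-in for the real `appenlight.lib.encryption` helpers (the `IntegrationConfig` name is hypothetical, purely for illustration):

```python
import codecs


def encrypt_dictionary_keys(d):
    # toy stand-in for appenlight.lib.encryption: ROT13 each value
    return {k: codecs.encode(str(v), 'rot13') for k, v in d.items()}


def decrypt_dictionary_keys(d):
    return {k: codecs.decode(v, 'rot13') for k, v in d.items()}


class IntegrationConfig:
    """Mimics IntegrationBase.config: only the encrypted form is stored."""

    def __init__(self):
        self._config = {}

    @property
    def config(self):
        return decrypt_dictionary_keys(self._config)

    @config.setter
    def config(self, value):
        if not hasattr(value, 'items'):
            raise TypeError('config only accepts flat dictionaries')
        self._config = encrypt_dictionary_keys(value)
```

The point of the pattern is that callers read and write plain dictionaries while the persisted column never holds plaintext.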
@@ -0,0 +1,168 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import requests | |||
|
23 | from requests_oauthlib import OAuth1 | |||
|
24 | from appenlight.models.integrations import (IntegrationBase, | |||
|
25 | IntegrationException) | |||
|
26 | ||||
|
27 | _ = str | |||
|
28 | ||||
|
29 | ||||
|
30 | class NotFoundException(Exception): | |||
|
31 | pass | |||
|
32 | ||||
|
33 | ||||
|
34 | class BitbucketIntegration(IntegrationBase): | |||
|
35 | __mapper_args__ = { | |||
|
36 | 'polymorphic_identity': 'bitbucket' | |||
|
37 | } | |||
|
38 | front_visible = True | |||
|
39 | as_alert_channel = False | |||
|
40 | supports_report_alerting = False | |||
|
41 | action_notification = True | |||
|
42 | integration_action = 'Add issue to Bitbucket' | |||
|
43 | ||||
|
44 | @classmethod | |||
|
45 | def create_client(cls, request, user_name=None, repo_name=None): | |||
|
46 | """ | |||
|
47 | Creates a REST client that can authenticate to a specific repo, | |||

48 | using the auth tokens of the current request user | |||
|
49 | """ | |||
|
50 | config = request.registry.settings | |||
|
51 | token = None | |||
|
52 | secret = None | |||
|
53 | for identity in request.user.external_identities: | |||
|
54 | if identity.provider_name == 'bitbucket': | |||
|
55 | token = identity.access_token | |||
|
56 | secret = identity.token_secret | |||
|
57 | break | |||
|
58 | if not token: | |||
|
59 | raise IntegrationException( | |||
|
60 | 'No valid auth token present for this service') | |||
|
61 | client = BitbucketClient(token, secret, | |||
|
62 | user_name, | |||
|
63 | repo_name, | |||
|
64 | config['authomatic.pr.bitbucket.key'], | |||
|
65 | config['authomatic.pr.bitbucket.secret']) | |||
|
66 | return client | |||
|
67 | ||||
|
68 | ||||
|
69 | class BitbucketClient(object): | |||
|
70 | api_url = 'https://bitbucket.org/api/1.0' | |||
|
71 | repo_type = 'bitbucket' | |||
|
72 | ||||
|
73 | def __init__(self, token, secret, owner, repo_name, bitbucket_consumer_key, | |||
|
74 | bitbucket_consumer_secret): | |||
|
75 | self.access_token = token | |||
|
76 | self.token_secret = secret | |||
|
77 | self.owner = owner | |||
|
78 | self.repo_name = repo_name | |||
|
79 | self.bitbucket_consumer_key = bitbucket_consumer_key | |||
|
80 | self.bitbucket_consumer_secret = bitbucket_consumer_secret | |||
|
81 | ||||
|
82 | possible_keys = { | |||
|
83 | 'status': ['new', 'open', 'resolved', 'on hold', 'invalid', | |||
|
84 | 'duplicate', 'wontfix'], | |||
|
85 | 'priority': ['trivial', 'minor', 'major', 'critical', 'blocker'], | |||
|
86 | 'kind': ['bug', 'enhancement', 'proposal', 'task'] | |||
|
87 | } | |||
|
88 | ||||
|
89 | def get_statuses(self): | |||
|
90 | """Gets list of possible item statuses""" | |||
|
91 | return self.possible_keys['status'] | |||
|
92 | ||||
|
93 | def get_priorities(self): | |||
|
94 | """Gets list of possible item priorities""" | |||
|
95 | return self.possible_keys['priority'] | |||
|
96 | ||||
|
97 | def make_request(self, url, method='get', data=None, headers=None): | |||
|
98 | """ | |||
|
99 | Performs HTTP request to bitbucket | |||
|
100 | """ | |||
|
101 | auth = OAuth1(self.bitbucket_consumer_key, | |||
|
102 | self.bitbucket_consumer_secret, | |||
|
103 | self.access_token, self.token_secret) | |||
|
104 | try: | |||
|
105 | resp = getattr(requests, method)(url, data=data, auth=auth, | |||
|
106 | timeout=10) | |||
|
107 | except Exception as e: | |||
|
108 | raise IntegrationException( | |||
|
109 | _('Error communicating with Bitbucket: %s') % (e,)) | |||
|
110 | if resp.status_code == 401: | |||
|
111 | raise IntegrationException( | |||
|
112 | _('You are not authorized to access this repo')) | |||
|
113 | elif resp.status_code == 404: | |||
|
114 | raise IntegrationException(_('User or repo name are incorrect')) | |||
|
115 | elif resp.status_code not in [200, 201]: | |||
|
116 | raise IntegrationException( | |||
|
117 | _('Bitbucket response_code: %s') % resp.status_code) | |||
|
118 | try: | |||
|
119 | return resp.json() | |||
|
120 | except Exception as e: | |||
|
121 | raise IntegrationException( | |||
|
122 | _('Error decoding response from Bitbucket: %s') % (e,)) | |||
|
123 | ||||
|
124 | def get_assignees(self): | |||
|
125 | """Gets list of possible assignees""" | |||
|
126 | url = '%(api_url)s/privileges/%(owner)s/%(repo_name)s' % { | |||
|
127 | 'api_url': self.api_url, | |||
|
128 | 'owner': self.owner, | |||
|
129 | 'repo_name': self.repo_name} | |||
|
130 | ||||
|
131 | data = self.make_request(url) | |||
|
132 | results = [{'user': self.owner, 'name': '(Repo owner)'}] | |||
|
133 | if data: | |||
|
134 | for entry in data: | |||
|
135 | results.append({"user": entry['user']['username'], | |||
|
136 | "name": entry['user'].get('display_name')}) | |||
|
137 | return results | |||
|
138 | ||||
|
139 | def create_issue(self, form_data): | |||
|
140 | """ | |||
|
141 | Creates a new issue in the tracker using a REST call | |||
|
142 | """ | |||
|
143 | url = '%(api_url)s/repositories/%(owner)s/%(repo_name)s/issues/' % { | |||
|
144 | 'api_url': self.api_url, | |||
|
145 | 'owner': self.owner, | |||
|
146 | 'repo_name': self.repo_name} | |||
|
147 | ||||
|
148 | payload = { | |||
|
149 | "title": form_data['title'], | |||
|
150 | "content": form_data['content'], | |||
|
151 | "kind": form_data['kind'], | |||
|
152 | "priority": form_data['priority'], | |||
|
153 | "responsible": form_data['responsible'] | |||
|
154 | } | |||
|
155 | data = self.make_request(url, 'post', payload) | |||
|
156 | f_args = { | |||
|
157 | "owner": self.owner, | |||
|
158 | "repo_name": self.repo_name, | |||
|
159 | "issue_id": data['local_id'] | |||
|
160 | } | |||
|
161 | web_url = 'https://bitbucket.org/%(owner)s/%(repo_name)s' \ | |||
|
162 | '/issue/%(issue_id)s/issue-title' % f_args | |||
|
163 | to_return = { | |||
|
164 | 'id': data['local_id'], | |||
|
165 | 'resource_url': data['resource_uri'], | |||
|
166 | 'web_url': web_url | |||
|
167 | } | |||
|
168 | return to_return |
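`create_issue` above assembles the issue's web URL with `%`-style dictionary formatting. That step can be sketched in isolation (the helper name is hypothetical):

```python
def bitbucket_issue_web_url(owner, repo_name, issue_id):
    # same %-style dict formatting as BitbucketClient.create_issue
    f_args = {'owner': owner, 'repo_name': repo_name, 'issue_id': issue_id}
    return ('https://bitbucket.org/%(owner)s/%(repo_name)s'
            '/issue/%(issue_id)s/issue-title' % f_args)
```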
@@ -0,0 +1,79 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | ||||
|
24 | from requests.exceptions import HTTPError, ConnectionError | |||
|
25 | from camplight import Request, Campfire | |||
|
26 | from camplight.exceptions import CamplightException | |||
|
27 | ||||
|
28 | from appenlight.models.integrations import (IntegrationBase, | |||
|
29 | IntegrationException) | |||
|
30 | ||||
|
31 | _ = str | |||
|
32 | ||||
|
33 | log = logging.getLogger(__name__) | |||
|
34 | ||||
|
35 | ||||
|
36 | class NotFoundException(Exception): | |||
|
37 | pass | |||
|
38 | ||||
|
39 | ||||
|
40 | class CampfireIntegration(IntegrationBase): | |||
|
41 | __mapper_args__ = { | |||
|
42 | 'polymorphic_identity': 'campfire' | |||
|
43 | } | |||
|
44 | front_visible = False | |||
|
45 | as_alert_channel = True | |||
|
46 | supports_report_alerting = True | |||
|
47 | action_notification = True | |||
|
48 | integration_action = 'Message via Campfire' | |||
|
49 | ||||
|
50 | @classmethod | |||
|
51 | def create_client(cls, api_token, account): | |||
|
52 | client = CampfireClient(api_token, account) | |||
|
53 | return client | |||
|
54 | ||||
|
55 | ||||
|
56 | class CampfireClient(object): | |||
|
57 | def __init__(self, api_token, account): | |||
|
58 | request = Request('https://%s.campfirenow.com' % account, api_token) | |||
|
59 | self.campfire = Campfire(request) | |||
|
60 | ||||
|
61 | def get_account(self): | |||
|
62 | try: | |||
|
63 | return self.campfire.account() | |||
|
64 | except (HTTPError, CamplightException) as e: | |||
|
65 | raise IntegrationException(str(e)) | |||
|
66 | ||||
|
67 | def get_rooms(self): | |||
|
68 | try: | |||
|
69 | return self.campfire.rooms() | |||
|
70 | except (HTTPError, CamplightException) as e: | |||
|
71 | raise IntegrationException(str(e)) | |||
|
72 | ||||
|
73 | def speak_to_room(self, room, message, sound='RIMSHOT'): | |||
|
74 | try: | |||
|
75 | room = self.campfire.room(room) | |||
|
76 | room.join() | |||
|
77 | room.speak(message, type_='TextMessage') | |||
|
78 | except (HTTPError, CamplightException, ConnectionError) as e: | |||
|
79 | raise IntegrationException(str(e)) |
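Each `CampfireClient` method repeats the same try/except that re-raises third-party errors as `IntegrationException`. That pattern can be factored into a small wrapper; a sketch under the assumption that only the listed exception types should be translated (`normalize_errors` is a hypothetical helper, not part of the codebase):

```python
class IntegrationException(Exception):
    pass


def normalize_errors(func, *error_types):
    # re-raise the listed exception types as IntegrationException,
    # mirroring the try/except repeated in CampfireClient methods
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except error_types as e:
            raise IntegrationException(str(e))
    return wrapper
```

With this wrapper, `get_account`, `get_rooms` and `speak_to_room` would reduce to one-line calls into `camplight`.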
@@ -0,0 +1,87 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | ||||
|
24 | import requests | |||
|
25 | from requests.auth import HTTPBasicAuth | |||
|
26 | import simplejson as json | |||
|
27 | ||||
|
28 | from appenlight.models.integrations import (IntegrationBase, | |||
|
29 | IntegrationException) | |||
|
30 | ||||
|
31 | _ = str | |||
|
32 | ||||
|
33 | log = logging.getLogger(__name__) | |||
|
34 | ||||
|
35 | ||||
|
36 | class NotFoundException(Exception): | |||
|
37 | pass | |||
|
38 | ||||
|
39 | ||||
|
40 | class FlowdockIntegration(IntegrationBase): | |||
|
41 | __mapper_args__ = { | |||
|
42 | 'polymorphic_identity': 'flowdock' | |||
|
43 | } | |||
|
44 | front_visible = False | |||
|
45 | as_alert_channel = True | |||
|
46 | supports_report_alerting = True | |||
|
47 | action_notification = True | |||
|
48 | integration_action = 'Message via Flowdock' | |||
|
49 | ||||
|
50 | @classmethod | |||
|
51 | def create_client(cls, api_token): | |||
|
52 | client = FlowdockClient(api_token) | |||
|
53 | return client | |||
|
54 | ||||
|
55 | ||||
|
56 | class FlowdockClient(object): | |||
|
57 | def __init__(self, api_token): | |||
|
58 | self.auth = HTTPBasicAuth(api_token, '') | |||
|
59 | self.api_token = api_token | |||
|
60 | self.api_url = 'https://api.flowdock.com/v1/messages' | |||
|
61 | ||||
|
62 | def make_request(self, url, method='get', data=None): | |||
|
63 | headers = { | |||
|
64 | 'Content-Type': 'application/json', | |||
|
65 | 'User-Agent': 'appenlight-flowdock', | |||
|
66 | } | |||
|
67 | try: | |||
|
68 | if data: | |||
|
69 | data = json.dumps(data) | |||
|
70 | resp = getattr(requests, method)(url, data=data, headers=headers, | |||
|
71 | timeout=10) | |||
|
72 | except Exception as e: | |||
|
73 | raise IntegrationException( | |||
|
74 | _('Error communicating with Flowdock: %s') % (e,)) | |||
|
75 | if resp.status_code > 299: | |||
|
76 | raise IntegrationException(resp.text) | |||
|
77 | return resp | |||
|
78 | ||||
|
79 | def send_to_chat(self, payload): | |||
|
80 | url = '%(api_url)s/chat/%(api_token)s' % {'api_url': self.api_url, | |||
|
81 | 'api_token': self.api_token} | |||
|
82 | return self.make_request(url, method='post', data=payload).json() | |||
|
83 | ||||
|
84 | def send_to_inbox(self, payload): | |||
|
85 | f_args = {'api_url': self.api_url, 'api_token': self.api_token} | |||
|
86 | url = '%(api_url)s/team_inbox/%(api_token)s' % f_args | |||
|
87 | return self.make_request(url, method='post', data=payload).json() |
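`send_to_chat` and `send_to_inbox` differ only in the endpoint segment, and `make_request` JSON-encodes dict payloads before POSTing. A sketch of those two pure steps, with no network involved (the function names are hypothetical):

```python
import json


def flowdock_endpoint(api_url, api_token, kind):
    # kind is 'chat' or 'team_inbox', as in send_to_chat / send_to_inbox
    return '%(api_url)s/%(kind)s/%(api_token)s' % {
        'api_url': api_url, 'kind': kind, 'api_token': api_token}


def serialize_payload(data):
    # make_request only json-encodes when a payload is present
    return json.dumps(data) if data else data
```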
@@ -0,0 +1,161 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import json | |||
|
23 | import requests | |||
|
24 | ||||
|
25 | from . import IntegrationBase, IntegrationException | |||
|
26 | ||||
|
27 | _ = str | |||
|
28 | ||||
|
29 | ||||
|
30 | class GithubAuthException(Exception): | |||
|
31 | pass | |||
|
32 | ||||
|
33 | ||||
|
34 | class GithubIntegration(IntegrationBase): | |||
|
35 | __mapper_args__ = { | |||
|
36 | 'polymorphic_identity': 'github' | |||
|
37 | } | |||
|
38 | front_visible = True | |||
|
39 | as_alert_channel = False | |||
|
40 | supports_report_alerting = False | |||
|
41 | action_notification = True | |||
|
42 | integration_action = 'Add issue to Github' | |||
|
43 | ||||
|
44 | @classmethod | |||
|
45 | def create_client(cls, request, user_name=None, repo_name=None): | |||
|
46 | """ | |||
|
47 | Creates a REST client that can authenticate to a specific repo, | |||

48 | using the auth tokens of the current request user | |||
|
49 | """ | |||
|
50 | token = None | |||
|
51 | secret = None | |||
|
52 | for identity in request.user.external_identities: | |||
|
53 | if identity.provider_name == 'github': | |||
|
54 | token = identity.access_token | |||
|
55 | secret = identity.token_secret | |||
|
56 | break | |||
|
57 | if not token: | |||
|
58 | raise IntegrationException( | |||
|
59 | 'No valid auth token present for this service') | |||
|
60 | client = GithubClient(token=token, owner=user_name, name=repo_name) | |||
|
61 | return client | |||
|
62 | ||||
|
63 | ||||
|
64 | class GithubClient(object): | |||
|
65 | api_url = 'https://api.github.com' | |||
|
66 | repo_type = 'github' | |||
|
67 | ||||
|
68 | def __init__(self, token, owner, name): | |||
|
69 | self.access_token = token | |||
|
70 | self.owner = owner | |||
|
71 | self.name = name | |||
|
72 | ||||
|
73 | def make_request(self, url, method='get', data=None, headers=None): | |||
|
74 | req_headers = {'User-Agent': 'appenlight', | |||
|
75 | 'Content-Type': 'application/json', | |||
|
76 | 'Authorization': 'token %s' % self.access_token} | |||
|
77 | try: | |||
|
78 | if data: | |||
|
79 | data = json.dumps(data) | |||
|
80 | resp = getattr(requests, method)(url, data=data, | |||
|
81 | headers=req_headers, | |||
|
82 | timeout=10) | |||
|
83 | except Exception as e: | |||
|
84 | msg = 'Error communicating with Github: %s' | |||
|
85 | raise IntegrationException(_(msg) % (e,)) | |||
|
86 | ||||
|
87 | if resp.status_code == 404: | |||
|
88 | msg = 'User or repo name are incorrect' | |||
|
89 | raise IntegrationException(_(msg)) | |||
|
90 | if resp.status_code == 401: | |||
|
91 | msg = 'You are not authorized to access this repo' | |||
|
92 | raise IntegrationException(_(msg)) | |||
|
93 | elif resp.status_code not in [200, 201]: | |||
|
94 | msg = 'Github response_code: %s' | |||
|
95 | raise IntegrationException(_(msg) % resp.status_code) | |||
|
96 | try: | |||
|
97 | return resp.json() | |||
|
98 | except Exception as e: | |||
|
99 | msg = 'Error decoding response from Github: %s' | |||
|
100 | raise IntegrationException(_(msg) % (e,)) | |||
|
101 | ||||
|
102 | def get_statuses(self): | |||
|
103 | """Gets list of possible item statuses""" | |||
|
104 | url = '%(api_url)s/repos/%(owner)s/%(name)s/labels' % { | |||
|
105 | 'api_url': self.api_url, | |||
|
106 | 'owner': self.owner, | |||
|
107 | 'name': self.name} | |||
|
108 | ||||
|
109 | data = self.make_request(url) | |||
|
110 | ||||
|
111 | statuses = [] | |||
|
112 | for status in data: | |||
|
113 | statuses.append(status['name']) | |||
|
114 | return statuses | |||
|
115 | ||||
|
116 | def get_repo(self): | |||
|
117 | """Gets repository information""" | |||
|
118 | url = '%(api_url)s/repos/%(owner)s/%(name)s' % { | |||
|
119 | 'api_url': self.api_url, | |||
|
120 | 'owner': self.owner, | |||
|
121 | 'name': self.name} | |||
|
122 | ||||
|
123 | data = self.make_request(url) | |||
|
124 | return data | |||
|
125 | ||||
|
126 | def get_assignees(self): | |||
|
127 | """Gets list of possible assignees""" | |||
|
128 | url = '%(api_url)s/repos/%(owner)s/%(name)s/collaborators' % { | |||
|
129 | 'api_url': self.api_url, | |||
|
130 | 'owner': self.owner, | |||
|
131 | 'name': self.name} | |||
|
132 | data = self.make_request(url) | |||
|
133 | results = [] | |||
|
134 | for entry in data: | |||
|
135 | results.append({"user": entry['login'], | |||
|
136 | "name": entry.get('name')}) | |||
|
137 | return results | |||
|
138 | ||||
|
139 | def create_issue(self, form_data): | |||
|
140 | """ | |||
|
141 | Make a REST call to create issue in Github's issue tracker | |||
|
142 | """ | |||
|
143 | url = '%(api_url)s/repos/%(owner)s/%(name)s/issues' % { | |||
|
144 | 'api_url': self.api_url, | |||
|
145 | 'owner': self.owner, | |||
|
146 | 'name': self.name} | |||
|
147 | ||||
|
148 | payload = { | |||
|
149 | "title": form_data['title'], | |||
|
150 | "body": form_data['content'], | |||
|
151 | "labels": [], | |||
|
152 | "assignee": form_data['responsible'] | |||
|
153 | } | |||
|
154 | payload['labels'].extend(form_data['kind']) | |||
|
155 | data = self.make_request(url, 'post', data=payload) | |||
|
156 | to_return = { | |||
|
157 | 'id': data['number'], | |||
|
158 | 'resource_url': data['url'], | |||
|
159 | 'web_url': data['html_url'] | |||
|
160 | } | |||
|
161 | return to_return |
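`make_request` selects the HTTP verb dynamically with `getattr(requests, method)`. The dispatch can be exercised without the network by substituting a stub for the `requests` module; `StubTransport` below is purely illustrative:

```python
class StubTransport:
    # stand-in for the requests module; records the verb and url
    def get(self, url, **kwargs):
        return ('GET', url)

    def post(self, url, **kwargs):
        return ('POST', url)


def make_request(transport, url, method='get', data=None):
    # dynamic verb dispatch, as in GithubClient.make_request
    return getattr(transport, method)(url, data=data, timeout=10)
```

The same substitution works for unit-testing any of the clients in this package that call `getattr(requests, method)`.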
@@ -0,0 +1,88 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | ||||
|
24 | import requests | |||
|
25 | ||||
|
26 | from . import IntegrationBase, IntegrationException | |||
|
27 | ||||
|
28 | _ = str | |||
|
29 | ||||
|
30 | log = logging.getLogger(__name__) | |||
|
31 | ||||
|
32 | ||||
|
33 | class NotFoundException(Exception): | |||
|
34 | pass | |||
|
35 | ||||
|
36 | ||||
|
37 | class HipchatIntegration(IntegrationBase): | |||
|
38 | __mapper_args__ = { | |||
|
39 | 'polymorphic_identity': 'hipchat' | |||
|
40 | } | |||
|
41 | front_visible = False | |||
|
42 | as_alert_channel = True | |||
|
43 | supports_report_alerting = True | |||
|
44 | action_notification = True | |||
|
45 | integration_action = 'Message via Hipchat' | |||
|
46 | ||||
|
47 | @classmethod | |||
|
48 | def create_client(cls, api_token): | |||
|
49 | client = HipchatClient(api_token) | |||
|
50 | return client | |||
|
51 | ||||
|
52 | ||||
|
53 | class HipchatClient(object): | |||
|
54 | def __init__(self, api_token): | |||
|
55 | self.api_token = api_token | |||
|
56 | self.api_url = 'https://api.hipchat.com/v1' | |||
|
57 | ||||
|
58 | def make_request(self, endpoint, method='get', data=None): | |||
|
59 | headers = { | |||
|
60 | 'User-Agent': 'appenlight-hipchat', | |||
|
61 | } | |||
|
62 | url = '%s%s' % (self.api_url, endpoint) | |||
|
63 | params = { | |||
|
64 | 'format': 'json', | |||
|
65 | 'auth_token': self.api_token | |||
|
66 | } | |||
|
67 | try: | |||
|
68 | resp = getattr(requests, method)(url, data=data, headers=headers, | |||
|
69 | params=params, | |||
|
70 | timeout=3) | |||
|
71 | except Exception as e: | |||
|
72 | msg = 'Error communicating with Hipchat: %s' | |||
|
73 | raise IntegrationException(_(msg) % (e,)) | |||
|
74 | if resp.status_code == 404: | |||
|
75 | msg = 'Error communicating with Hipchat - Room not found' | |||
|
76 | raise IntegrationException(msg) | |||
|
77 | elif resp.status_code != requests.codes.ok: | |||
|
78 | msg = 'Error communicating with Hipchat - status code: %s' | |||
|
79 | raise IntegrationException(msg % resp.status_code) | |||
|
80 | return resp | |||
|
81 | ||||
|
82 | def get_rooms(self): | |||
|
83 | # not used with notification api token | |||
|
84 | return self.make_request('/rooms/list') | |||
|
85 | ||||
|
86 | def send(self, payload): | |||
|
87 | return self.make_request('/rooms/message', method='post', | |||
|
88 | data=payload).json() |
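`make_request` maps HipChat status codes onto `IntegrationException` messages. That mapping is a pure function and can be sketched on its own (`check_hipchat_status` is a hypothetical name; the literal 200 stands in for `requests.codes.ok`):

```python
class IntegrationException(Exception):
    pass


def check_hipchat_status(status_code):
    # mirrors the error translation in HipchatClient.make_request
    if status_code == 404:
        raise IntegrationException(
            'Error communicating with Hipchat - Room not found')
    if status_code != 200:  # requests.codes.ok
        raise IntegrationException(
            'Error communicating with Hipchat - status code: %s' % status_code)
```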
@@ -0,0 +1,133 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import jira | |||
|
23 | from appenlight.models.integrations import (IntegrationBase, | |||
|
24 | IntegrationException) | |||
|
25 | ||||
|
26 | _ = str | |||
|
27 | ||||
|
28 | ||||
|
29 | class NotFoundException(Exception): | |||
|
30 | pass | |||
|
31 | ||||
|
32 | ||||
|
33 | class JiraIntegration(IntegrationBase): | |||
|
34 | __mapper_args__ = { | |||
|
35 | 'polymorphic_identity': 'jira' | |||
|
36 | } | |||
|
37 | front_visible = True | |||
|
38 | as_alert_channel = False | |||
|
39 | supports_report_alerting = False | |||
|
40 | action_notification = True | |||
|
41 | integration_action = 'Add issue to Jira' | |||
|
42 | ||||
|
43 | ||||
|
44 | class JiraClient(object): | |||
|
45 | def __init__(self, user_name, password, host_name, project, request=None): | |||
|
46 | self.user_name = user_name | |||
|
47 | self.password = password | |||
|
48 | self.host_name = host_name | |||
|
49 | self.project = project | |||
|
50 | self.request = request | |||
|
51 | try: | |||
|
52 | self.client = jira.client.JIRA(options={'server': host_name}, | |||
|
53 | basic_auth=(user_name, password)) | |||
|
54 | except jira.JIRAError as e: | |||
|
55 | raise IntegrationException( | |||
|
56 | 'Communication problem: HTTP_STATUS:%s, URL:%s ' % ( | |||
|
57 | e.status_code, e.url)) | |||
|
58 | ||||
|
59 | def get_projects(self): | |||
|
60 | projects = self.client.projects() | |||
|
61 | return projects | |||
|
62 | ||||
|
63 | def get_assignees(self): | |||
|
64 | """Gets list of possible assignees""" | |||
|
65 | users = self.client.search_assignable_users_for_issues( | |||
|
66 | None, project=self.project) | |||
|
67 | results = [] | |||
|
68 | for user in users: | |||
|
69 | results.append({"id": user.name, "name": user.displayName}) | |||
|
70 | return results | |||
|
71 | ||||
|
72 | def get_metadata(self): | |||
|
73 | def cached(project_name): | |||
|
74 | metadata = self.client.createmeta( | |||
|
75 | projectKeys=project_name, expand='projects.issuetypes.fields') | |||
|
76 | assignees = self.get_assignees() | |||
|
77 | parsed_metadata = [] | |||
|
78 | for entry in metadata['projects'][0]['issuetypes']: | |||
|
79 | issue = {"name": entry['name'], | |||
|
80 | "id": entry['id'], | |||
|
81 | "fields": []} | |||
|
82 | for i_id, field_i in entry['fields'].items(): | |||
|
83 | field = { | |||
|
84 | "name": field_i['name'], | |||
|
85 | "id": i_id, | |||
|
86 | "required": field_i['required'], | |||
|
87 | "values": [], | |||
|
88 | "type": field_i['schema'].get('type') | |||
|
89 | } | |||
|
90 | if field_i.get('allowedValues'): | |||
|
91 | field['values'] = [] | |||
|
92 | for i in field_i['allowedValues']: | |||
|
93 | field['values'].append( | |||
|
94 | {'id': i['id'], | |||
|
95 | 'name': i.get('name', i.get('value', '')) | |||
|
96 | }) | |||
|
97 | if field['id'] == 'assignee': | |||
|
98 | field['values'] = assignees | |||
|
99 | ||||
|
100 | issue['fields'].append(field) | |||
|
101 | parsed_metadata.append(issue) | |||
|
102 | return parsed_metadata | |||
|
103 | ||||
|
104 | return cached(self.project) | |||
|
105 | ||||
|
106 | def create_issue(self, form_data): | |||
|
107 | metadata = self.get_metadata() | |||
|
108 | payload = { | |||
|
109 | 'project': {'key': form_data['project']}, | |||
|
110 | 'summary': form_data['title'], | |||
|
111 | 'description': form_data['content'], | |||
|
112 | 'issuetype': {'id': '1'}, | |||
|
113 | "priority": {'id': form_data['priority']}, | |||
|
114 | "assignee": {'name': form_data['responsible']}, | |||
|
115 | } | |||
|
116 | for issue_type in metadata: | |||
|
117 | if issue_type['id'] == '1': | |||
|
118 | for field in issue_type['fields']: | |||
|
119 | if field['id'] == 'reporter': | |||

120 | payload["reporter"] = {'id': self.user_name} | |||
|
121 | if field['required'] and field['id'] not in payload: | |||
|
122 | if field['type'] == 'array': | |||
|
123 | payload[field['id']] = [field['values'][0], ] | |||
|
124 | elif field['type'] == 'string': | |||
|
125 | payload[field['id']] = '' | |||
|
126 | new_issue = self.client.create_issue(fields=payload) | |||
|
127 | web_url = self.host_name + '/browse/' + new_issue.key | |||
|
128 | to_return = { | |||
|
129 | 'id': new_issue.id, | |||
|
130 | 'resource_url': new_issue.self, | |||
|
131 | 'web_url': web_url | |||
|
132 | } | |||
|
133 | return to_return |
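The required-field defaulting loop in `JiraClient.create_issue` can be sketched on its own. The field dicts below are illustrative, shaped like the entries `get_metadata` builds (`id`, `required`, `type`, `values`); this is a minimal sketch, not the library API:

```python
def fill_required_fields(payload, fields):
    """Default any required issue fields missing from the payload,
    mirroring the loop in JiraClient.create_issue."""
    for field in fields:
        if field['required'] and field['id'] not in payload:
            if field['type'] == 'array':
                # arrays get their first allowed value as a one-element list
                payload[field['id']] = [field['values'][0]]
            elif field['type'] == 'string':
                # strings default to empty
                payload[field['id']] = ''
    return payload

# hypothetical metadata entries for demonstration
fields = [
    {'id': 'components', 'required': True, 'type': 'array',
     'values': [{'id': '10', 'name': 'backend'}]},
    {'id': 'environment', 'required': True, 'type': 'string', 'values': []},
    {'id': 'labels', 'required': False, 'type': 'array', 'values': []},
]
payload = {'summary': 'demo'}
fill_required_fields(payload, fields)
```

After the call, only the two required fields have been defaulted; optional fields are left out of the payload.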
@@ -0,0 +1,79 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | ||||
|
24 | import requests | |||
|
25 | ||||
|
26 | from appenlight.models.integrations import (IntegrationBase, | |||
|
27 | IntegrationException) | |||
|
28 | from appenlight.lib.ext_json import json | |||
|
29 | ||||
|
30 | _ = str | |||
|
31 | ||||
|
32 | log = logging.getLogger(__name__) | |||
|
33 | ||||
|
34 | ||||
|
35 | class NotFoundException(Exception): | |||
|
36 | pass | |||
|
37 | ||||
|
38 | ||||
|
39 | class SlackIntegration(IntegrationBase): | |||
|
40 | __mapper_args__ = { | |||
|
41 | 'polymorphic_identity': 'slack' | |||
|
42 | } | |||
|
43 | front_visible = False | |||
|
44 | as_alert_channel = True | |||
|
45 | supports_report_alerting = True | |||
|
46 | action_notification = True | |||
|
47 | integration_action = 'Message via Slack' | |||
|
48 | ||||
|
49 | @classmethod | |||
|
50 | def create_client(cls, api_token): | |||
|
51 | client = SlackClient(api_token) | |||
|
52 | return client | |||
|
53 | ||||
|
54 | ||||
|
55 | class SlackClient(object): | |||
|
56 | def __init__(self, api_url): | |||
|
57 | self.api_url = api_url | |||
|
58 | ||||
|
59 | def make_request(self, data=None): | |||
|
60 | headers = { | |||
|
61 | 'User-Agent': 'appenlight-slack', | |||
|
62 | 'Content-Type': 'application/json' | |||
|
63 | } | |||
|
64 | try: | |||
|
65 | resp = requests.post(self.api_url, | |||
|
66 | data=json.dumps(data), | |||
|
67 | headers=headers, | |||
|
68 | timeout=3) | |||
|
69 | except Exception as e: | |||
|
70 | raise IntegrationException( | |||
|
71 | _('Error communicating with Slack: %s') % (e,)) | |||
|
72 | if resp.status_code != requests.codes.ok: | |||
|
73 | msg = 'Error communicating with Slack - status code: %s' | |||
|
74 | raise IntegrationException(msg % resp.status_code) | |||
|
75 | return resp | |||
|
76 | ||||
|
77 | def send(self, payload): | |||
|
78 | # make_request posts to self.api_url and accepts only the payload | |||

79 | return self.make_request(payload).json() |
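The request `SlackClient.make_request` issues can be exercised without the network by separating serialization from transport. The webhook URL below is a placeholder; header values follow the code above:

```python
import json

def build_slack_request(api_url, data):
    """Return the (url, headers, body) triple that make_request would POST."""
    headers = {
        'User-Agent': 'appenlight-slack',
        'Content-Type': 'application/json',
    }
    # the payload is serialized to a JSON string, not form-encoded
    return api_url, headers, json.dumps(data)

url, headers, body = build_slack_request(
    'https://hooks.slack.com/services/T000/B000/XXXX',  # placeholder URL
    {'text': 'report alert'})
```

Splitting the builder out this way also makes the timeout/error handling in `make_request` testable against a stubbed transport.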
@@ -0,0 +1,143 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | ||||
|
24 | import requests | |||
|
25 | ||||
|
26 | from appenlight.models.integrations import (IntegrationBase, | |||
|
27 | IntegrationException) | |||
|
28 | from appenlight.models.alert_channel import AlertChannel | |||
|
29 | from appenlight.lib.ext_json import json | |||
|
30 | ||||
|
31 | _ = str | |||
|
32 | ||||
|
33 | log = logging.getLogger(__name__) | |||
|
34 | ||||
|
35 | ||||
|
36 | class NotFoundException(Exception): | |||
|
37 | pass | |||
|
38 | ||||
|
39 | ||||
|
40 | class WebhooksIntegration(IntegrationBase): | |||
|
41 | __mapper_args__ = { | |||
|
42 | 'polymorphic_identity': 'webhooks' | |||
|
43 | } | |||
|
44 | front_visible = False | |||
|
45 | as_alert_channel = True | |||
|
46 | supports_report_alerting = True | |||
|
47 | action_notification = True | |||
|
48 | integration_action = 'Message via Webhooks' | |||
|
49 | ||||
|
50 | @classmethod | |||
|
51 | def create_client(cls, url): | |||
|
52 | client = WebhooksClient(url) | |||
|
53 | return client | |||
|
54 | ||||
|
55 | ||||
|
56 | class WebhooksClient(object): | |||
|
57 | def __init__(self, url): | |||
|
58 | self.api_url = url | |||
|
59 | ||||
|
60 | def make_request(self, url, method='get', data=None): | |||
|
61 | headers = { | |||
|
62 | 'Content-Type': 'application/json', | |||
|
63 | 'User-Agent': 'appenlight-webhooks', | |||
|
64 | } | |||
|
65 | try: | |||
|
66 | if data: | |||
|
67 | data = json.dumps(data) | |||
|
68 | resp = getattr(requests, method)(url, data=data, headers=headers, | |||
|
69 | timeout=3) | |||
|
70 | except Exception as e: | |||
|
71 | raise IntegrationException( | |||
|
72 | _('Error communicating with Webhooks: %s') % (e,)) | |||
|
73 | if resp.status_code > 299: | |||
|
74 | raise IntegrationException( | |||
|
75 | 'Error communicating with Webhooks - status code: %s' % ( | |||
|
76 | resp.status_code)) | |||
|
77 | return resp | |||
|
78 | ||||
|
79 | def send_to_hook(self, payload): | |||
|
80 | return self.make_request(self.api_url, method='post', | |||
|
81 | data=payload).json() | |||
|
82 | ||||
|
83 | ||||
|
84 | class WebhooksAlertChannel(AlertChannel): | |||
|
85 | __mapper_args__ = { | |||
|
86 | 'polymorphic_identity': 'webhooks' | |||
|
87 | } | |||
|
88 | ||||
|
89 | def notify_reports(self, **kwargs): | |||
|
90 | """ | |||
|
91 | Notify user of individual reports | |||
|
92 | ||||
|
93 | kwargs: | |||
|
94 | application: application that the event applies to, | |||
|
95 | user: user that should be notified | |||
|
96 | request: request object | |||
|
97 | since_when: reports are newer than this time value, | |||
|
98 | reports: list of reports to render | |||
|
99 | ||||
|
100 | """ | |||
|
101 | template_vars = self.get_notification_basic_vars(kwargs) | |||
|
102 | payload = [] | |||
|
103 | include_keys = ('id', 'http_status', 'report_type', 'resource_name', | |||
|
104 | 'front_url', 'resource_id', 'error', 'url_path', | |||
|
105 | 'tags', 'duration') | |||
|
106 | ||||
|
107 | for occurences, report in kwargs['reports']: | |||
|
108 | r_dict = report.last_report_ref.get_dict(kwargs['request'], | |||
|
109 | include_keys=include_keys) | |||
|
110 | r_dict['group']['occurences'] = occurences | |||
|
111 | payload.append(r_dict) | |||
|
112 | client = WebhooksIntegration.create_client( | |||
|
113 | self.integration.config['reports_webhook']) | |||
|
114 | client.send_to_hook(payload) | |||
|
115 | ||||
|
116 | def notify_alert(self, **kwargs): | |||
|
117 | """ | |||
|
118 | Notify user of report or uptime threshold events based on events alert type | |||
|
119 | ||||
|
120 | Kwargs: | |||
|
121 | application: application that the event applies to, | |||
|
122 | event: event that is notified, | |||
|
123 | user: user that should be notified | |||
|
124 | request: request object | |||
|
125 | ||||
|
126 | """ | |||
|
127 | payload = { | |||
|
128 | 'alert_action': kwargs['event'].unified_alert_action(), | |||
|
129 | 'alert_name': kwargs['event'].unified_alert_name(), | |||
|
130 | 'event_time': kwargs['event'].end_date or kwargs[ | |||
|
131 | 'event'].start_date, | |||
|
132 | 'resource_name': None, | |||
|
133 | 'resource_id': None | |||
|
134 | } | |||
|
135 | if kwargs['event'].values and kwargs['event'].values.get('reports'): | |||
|
136 | payload['reports'] = kwargs['event'].values.get('reports', []) | |||
|
137 | if 'application' in kwargs: | |||
|
138 | payload['resource_name'] = kwargs['application'].resource_name | |||
|
139 | payload['resource_id'] = kwargs['application'].resource_id | |||
|
140 | ||||
|
141 | client = WebhooksIntegration.create_client( | |||
|
142 | self.integration.config['alerts_webhook']) | |||
|
143 | client.send_to_hook(payload) |
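String templates with `%`-style placeholders must be interpolated with the `%` operator; calling `.format` on them is a silent no-op, because `.format` only substitutes `{}`-style fields. A quick demonstration with the message used in `make_request`:

```python
template = 'Error communicating with Webhooks - status code: %s'

# .format finds no {} fields, so it returns the template unchanged
broken = template.format(500)

# %-interpolation actually substitutes the value
fixed = template % (500,)
```

The broken variant would surface to users as a message still containing a literal `%s`.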
@@ -0,0 +1,135 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | import logging | |||
|
24 | import hashlib | |||
|
25 | ||||
|
26 | from datetime import datetime | |||
|
27 | from appenlight.models import Base | |||
|
28 | from appenlight.lib.utils import convert_es_type | |||
|
29 | from appenlight.lib.enums import LogLevel | |||
|
30 | from sqlalchemy.dialects.postgresql import JSON | |||
|
31 | from ziggurat_foundations.models.base import BaseModel | |||
|
32 | ||||
|
33 | log = logging.getLogger(__name__) | |||
|
34 | ||||
|
35 | ||||
|
36 | class Log(Base, BaseModel): | |||
|
37 | __tablename__ = 'logs' | |||
|
38 | __table_args__ = {'implicit_returning': False} | |||
|
39 | ||||
|
40 | log_id = sa.Column(sa.BigInteger(), nullable=False, primary_key=True) | |||
|
41 | resource_id = sa.Column(sa.Integer(), | |||
|
42 | sa.ForeignKey('applications.resource_id', | |||
|
43 | onupdate='CASCADE', | |||
|
44 | ondelete='CASCADE'), | |||
|
45 | nullable=False, | |||
|
46 | index=True) | |||
|
47 | log_level = sa.Column(sa.Unicode, nullable=False, index=True, | |||
|
48 | default='INFO') | |||
|
49 | message = sa.Column(sa.UnicodeText(), default='') | |||
|
50 | timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
51 | server_default=sa.func.now()) | |||
|
52 | request_id = sa.Column(sa.Unicode()) | |||
|
53 | namespace = sa.Column(sa.Unicode()) | |||
|
54 | primary_key = sa.Column(sa.Unicode()) | |||
|
55 | ||||
|
56 | tags = sa.Column(JSON(), default={}) | |||
|
57 | permanent = sa.Column(sa.Boolean(), nullable=False, default=False) | |||
|
58 | ||||
|
59 | def __str__(self): | |||
|
60 | return self.__unicode__().encode('utf8') | |||
|
61 | ||||
|
62 | def __unicode__(self): | |||
|
63 | return '<Log id:%s, lv:%s, ns:%s >' % ( | |||
|
64 | self.log_id, self.log_level, self.namespace) | |||
|
65 | ||||
|
66 | def set_data(self, data, resource): | |||
|
67 | level = data.get('log_level').upper() | |||
|
68 | self.log_level = getattr(LogLevel, level, LogLevel.UNKNOWN) | |||
|
69 | self.message = data.get('message', '') | |||
|
70 | server_name = data.get('server', '').lower() or 'unknown' | |||
|
71 | self.tags = { | |||
|
72 | 'server_name': server_name | |||
|
73 | } | |||
|
74 | if data.get('tags'): | |||
|
75 | for tag_tuple in data['tags']: | |||
|
76 | self.tags[tag_tuple[0]] = tag_tuple[1] | |||
|
77 | self.timestamp = data['date'] | |||
|
78 | r_id = data.get('request_id', '') | |||
|
79 | if not r_id: | |||
|
80 | r_id = '' | |||
|
81 | self.request_id = r_id.replace('-', '') | |||
|
82 | self.resource_id = resource.resource_id | |||
|
83 | self.namespace = data.get('namespace') or '' | |||
|
84 | self.permanent = data.get('permanent') | |||
|
85 | self.primary_key = data.get('primary_key') | |||
|
86 | if self.primary_key is not None: | |||
|
87 | self.tags['appenlight_primary_key'] = self.primary_key | |||
|
88 | ||||
|
89 | def get_dict(self): | |||
|
90 | instance_dict = super(Log, self).get_dict() | |||
|
91 | instance_dict['log_level'] = LogLevel.key_from_value(self.log_level) | |||
|
92 | instance_dict['resource_name'] = self.application.resource_name | |||
|
93 | return instance_dict | |||
|
94 | ||||
|
95 | @property | |||
|
96 | def delete_hash(self): | |||
|
97 | if not self.primary_key: | |||
|
98 | return None | |||
|
99 | ||||
|
100 | to_hash = '{}_{}_{}'.format(self.resource_id, self.primary_key, | |||
|
101 | self.namespace) | |||
|
102 | return hashlib.sha1(to_hash.encode('utf8')).hexdigest() | |||
|
103 | ||||
|
104 | def es_doc(self): | |||
|
105 | tags = {} | |||
|
106 | tag_list = [] | |||
|
107 | for name, value in self.tags.items(): | |||
|
108 | # replace dot in indexed tag name | |||
|
109 | name = name.replace('.', '_') | |||
|
110 | tag_list.append(name) | |||
|
111 | tags[name] = { | |||
|
112 | "values": convert_es_type(value), | |||
|
113 | "numeric_values": value if ( | |||
|
114 | isinstance(value, (int, float)) and | |||
|
115 | not isinstance(value, bool)) else None | |||
|
116 | } | |||
|
117 | return { | |||
|
118 | 'pg_id': str(self.log_id), | |||
|
119 | 'delete_hash': self.delete_hash, | |||
|
120 | 'resource_id': self.resource_id, | |||
|
121 | 'request_id': self.request_id, | |||
|
122 | 'log_level': LogLevel.key_from_value(self.log_level), | |||
|
123 | 'timestamp': self.timestamp, | |||
|
124 | 'message': self.message if self.message else '', | |||
|
125 | 'namespace': self.namespace if self.namespace else '', | |||
|
126 | 'tags': tags, | |||
|
127 | 'tag_list': tag_list | |||
|
128 | } | |||
|
129 | ||||
|
130 | @property | |||
|
131 | def partition_id(self): | |||
|
132 | if self.permanent: | |||
|
133 | return 'rcae_l_%s' % self.timestamp.strftime('%Y_%m') | |||
|
134 | else: | |||
|
135 | return 'rcae_l_%s' % self.timestamp.strftime('%Y_%m_%d') |
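The tag handling in `Log.es_doc` can be sketched as a standalone function. This is a simplified version: `convert_es_type` is omitted and raw values are kept, but the dot replacement and the bool-excluding numeric check mirror the code above:

```python
def build_es_tags(tags):
    """Mirror Log.es_doc tag handling: dots in tag names are replaced
    (not allowed in the index), and only non-bool numbers get
    numeric_values for range queries."""
    out, tag_list = {}, []
    for name, value in tags.items():
        name = name.replace('.', '_')
        tag_list.append(name)
        # bool is a subclass of int, so it must be excluded explicitly
        is_number = (isinstance(value, (int, float))
                     and not isinstance(value, bool))
        out[name] = {'values': value,
                     'numeric_values': value if is_number else None}
    return out, tag_list

tags, names = build_es_tags({'server.name': 'web01', 'retries': 3, 'ok': True})
```

Without the explicit `bool` check, `True` would be indexed as the number `1`, which would pollute numeric aggregations over tags.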
@@ -0,0 +1,45 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | from ziggurat_foundations.models.base import BaseModel | |||
|
24 | from sqlalchemy.dialects.postgresql import JSON | |||
|
25 | ||||
|
26 | from . import Base | |||
|
27 | ||||
|
28 | ||||
|
29 | class PluginConfig(Base, BaseModel): | |||
|
30 | __tablename__ = 'plugin_configs' | |||
|
31 | ||||
|
32 | id = sa.Column(sa.Integer, primary_key=True) | |||
|
33 | plugin_name = sa.Column(sa.Unicode) | |||
|
34 | section = sa.Column(sa.Unicode) | |||
|
35 | config = sa.Column(JSON, nullable=False) | |||
|
36 | resource_id = sa.Column(sa.Integer(), | |||
|
37 | sa.ForeignKey('resources.resource_id', | |||
|
38 | onupdate='cascade', | |||
|
39 | ondelete='cascade')) | |||
|
40 | owner_id = sa.Column(sa.Integer(), | |||
|
41 | sa.ForeignKey('users.id', onupdate='cascade', | |||
|
42 | ondelete='cascade')) | |||
|
43 | ||||
|
44 | def __json__(self, request): | |||
|
45 | return self.get_dict() |
@@ -0,0 +1,489 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from datetime import datetime | |||
|
23 | import math | |||
|
24 | import uuid | |||
|
25 | import hashlib | |||
|
26 | import copy | |||
|
27 | import urllib.parse | |||
|
28 | import logging | |||
|
29 | import sqlalchemy as sa | |||
|
30 | ||||
|
31 | from appenlight.models import Base, Datastores | |||
|
32 | from appenlight.lib.utils.date_utils import convert_date | |||
|
33 | from appenlight.lib.utils import convert_es_type | |||
|
34 | from appenlight.models.slow_call import SlowCall | |||
|
35 | from appenlight.lib.utils import cometd_request | |||
|
36 | from appenlight.lib.enums import ReportType, Language | |||
|
37 | from pyramid.threadlocal import get_current_registry, get_current_request | |||
|
38 | from sqlalchemy.dialects.postgresql import JSON | |||
|
39 | from ziggurat_foundations.models.base import BaseModel | |||
|
40 | ||||
|
41 | log = logging.getLogger(__name__) | |||
|
42 | ||||
|
43 | REPORT_TYPE_MATRIX = { | |||
|
44 | 'http_status': {"type": 'int', | |||
|
45 | "ops": ('eq', 'ne', 'ge', 'le',)}, | |||
|
46 | 'group:priority': {"type": 'int', | |||
|
47 | "ops": ('eq', 'ne', 'ge', 'le',)}, | |||
|
48 | 'duration': {"type": 'float', | |||
|
49 | "ops": ('ge', 'le',)}, | |||
|
50 | 'url_domain': {"type": 'unicode', | |||
|
51 | "ops": ('eq', 'ne', 'startswith', 'endswith', 'contains',)}, | |||
|
52 | 'url_path': {"type": 'unicode', | |||
|
53 | "ops": ('eq', 'ne', 'startswith', 'endswith', 'contains',)}, | |||
|
54 | 'error': {"type": 'unicode', | |||
|
55 | "ops": ('eq', 'ne', 'startswith', 'endswith', 'contains',)}, | |||
|
56 | 'tags:server_name': {"type": 'unicode', | |||
|
57 | "ops": ('eq', 'ne', 'startswith', 'endswith', | |||
|
58 | 'contains',)}, | |||
|
59 | 'traceback': {"type": 'unicode', | |||
|
60 | "ops": ('contains',)}, | |||
|
61 | 'group:occurences': {"type": 'int', | |||
|
62 | "ops": ('eq', 'ne', 'ge', 'le',)} | |||
|
63 | } | |||
|
64 | ||||
|
65 | ||||
|
66 | class Report(Base, BaseModel): | |||
|
67 | __tablename__ = 'reports' | |||
|
68 | __table_args__ = {'implicit_returning': False} | |||
|
69 | ||||
|
70 | id = sa.Column(sa.Integer, nullable=False, primary_key=True) | |||
|
71 | group_id = sa.Column(sa.BigInteger, | |||
|
72 | sa.ForeignKey('reports_groups.id', ondelete='cascade', | |||
|
73 | onupdate='cascade')) | |||
|
74 | resource_id = sa.Column(sa.Integer(), nullable=False, index=True) | |||
|
75 | report_type = sa.Column(sa.Integer(), nullable=False, index=True) | |||
|
76 | error = sa.Column(sa.UnicodeText(), index=True) | |||
|
77 | extra = sa.Column(JSON(), default={}) | |||
|
78 | request = sa.Column(JSON(), nullable=False, default={}) | |||
|
79 | ip = sa.Column(sa.String(39), index=True, default='') | |||
|
80 | username = sa.Column(sa.Unicode(255), default='') | |||
|
81 | user_agent = sa.Column(sa.Unicode(255), default='') | |||
|
82 | url = sa.Column(sa.UnicodeText(), index=True) | |||
|
83 | request_id = sa.Column(sa.Text()) | |||
|
84 | request_stats = sa.Column(JSON(), nullable=False, default={}) | |||
|
85 | traceback = sa.Column(JSON(), nullable=False, default=None) | |||
|
86 | traceback_hash = sa.Column(sa.Text()) | |||
|
87 | start_time = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
88 | server_default=sa.func.now()) | |||
|
89 | end_time = sa.Column(sa.DateTime()) | |||
|
90 | duration = sa.Column(sa.Float, default=0) | |||
|
91 | http_status = sa.Column(sa.Integer, index=True) | |||
|
92 | url_domain = sa.Column(sa.Unicode(100), index=True) | |||
|
93 | url_path = sa.Column(sa.Unicode(255), index=True) | |||
|
94 | tags = sa.Column(JSON(), nullable=False, default={}) | |||
|
95 | language = sa.Column(sa.Integer(), default=0) | |||
|
96 | # this is used to determine partition for the report | |||
|
97 | report_group_time = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
98 | server_default=sa.func.now()) | |||
|
99 | ||||
|
100 | logs = sa.orm.relationship( | |||
|
101 | 'Log', | |||
|
102 | lazy='dynamic', | |||
|
103 | passive_deletes=True, | |||
|
104 | passive_updates=True, | |||
|
105 | primaryjoin="and_(Report.request_id==Log.request_id, " | |||
|
106 | "Log.request_id != None, Log.request_id != '')", | |||
|
107 | foreign_keys='[Log.request_id]') | |||
|
108 | ||||
|
109 | slow_calls = sa.orm.relationship('SlowCall', | |||
|
110 | backref='detail', | |||
|
111 | cascade="all, delete-orphan", | |||
|
112 | passive_deletes=True, | |||
|
113 | passive_updates=True, | |||
|
114 | order_by='SlowCall.timestamp') | |||
|
115 | ||||
|
116 | def set_data(self, data, resource, protocol_version=None): | |||
|
117 | self.http_status = data['http_status'] | |||
|
118 | self.priority = data['priority'] | |||
|
119 | self.error = data['error'] | |||
|
120 | report_language = data.get('language', '').lower() | |||
|
121 | self.language = getattr(Language, report_language, Language.unknown) | |||
|
122 | # we need temp holder here to decide later | |||
|
123 | # if we want to to commit the tags if report is marked for creation | |||
|
124 | self.tags = { | |||
|
125 | 'server_name': data['server'], | |||
|
126 | 'view_name': data['view_name'] | |||
|
127 | } | |||
|
128 | if data.get('tags'): | |||
|
129 | for tag_tuple in data['tags']: | |||
|
130 | self.tags[tag_tuple[0]] = tag_tuple[1] | |||
|
131 | self.traceback = data['traceback'] | |||
|
132 | stripped_traceback = self.stripped_traceback() | |||
|
133 | tb_repr = repr(stripped_traceback).encode('utf8') | |||
|
134 | self.traceback_hash = hashlib.sha1(tb_repr).hexdigest() | |||
|
135 | url_info = urllib.parse.urlsplit( | |||
|
136 | data.get('url', ''), allow_fragments=False) | |||
|
137 | self.url_domain = url_info.netloc[:100] | |||

138 | self.url_path = url_info.path[:255] | |||
|
139 | self.occurences = data['occurences'] | |||
|
140 | if self.error: | |||
|
141 | self.report_type = ReportType.error | |||
|
142 | else: | |||
|
143 | self.report_type = ReportType.slow | |||
|
144 | ||||
|
145 | # but if its status 404 its 404 type | |||
|
146 | if self.http_status in [404, '404'] or self.error == '404 Not Found': | |||
|
147 | self.report_type = ReportType.not_found | |||
|
148 | self.error = '' | |||
|
149 | ||||
|
150 | self.generate_grouping_hash(data.get('appenlight.group_string', | |||
|
151 | data.get('group_string')), | |||
|
152 | resource.default_grouping, | |||
|
153 | protocol_version) | |||
|
154 | ||||
|
155 | # details | |||
|
156 | if data['http_status'] in [404, '404']: | |||
|
157 | data = {"username": data["username"], | |||
|
158 | "ip": data["ip"], | |||
|
159 | "url": data["url"], | |||
|
160 | "user_agent": data["user_agent"]} | |||
|
161 | if data.get('HTTP_REFERER') or data.get('http_referer'): | |||
|
162 | data['HTTP_REFERER'] = data.get( | |||
|
163 | 'HTTP_REFERER', '') or data.get('http_referer', '') | |||
|
164 | ||||
|
165 | self.resource_id = resource.resource_id | |||
|
166 | self.username = data['username'] | |||
|
167 | self.user_agent = data['user_agent'] | |||
|
168 | self.ip = data['ip'] | |||
|
169 | self.extra = {} | |||
|
170 | if data.get('extra'): | |||
|
171 | for extra_tuple in data['extra']: | |||
|
172 | self.extra[extra_tuple[0]] = extra_tuple[1] | |||
|
173 | ||||
|
174 | self.url = data['url'] | |||
|
175 | self.request_id = data.get('request_id', '').replace('-', '') or str( | |||
|
176 | uuid.uuid4()) | |||
|
177 | request_data = data.get('request', {}) | |||
|
178 | ||||
|
179 | self.request = request_data | |||
|
180 | self.request_stats = data.get('request_stats', {}) | |||
|
181 | traceback = data.get('traceback') | |||
|
182 | if not traceback: | |||
|
183 | traceback = data.get('frameinfo') | |||
|
184 | self.traceback = traceback | |||
|
185 | start_date = convert_date(data.get('start_time')) | |||
|
186 | if not self.start_time or self.start_time < start_date: | |||
|
187 | self.start_time = start_date | |||
|
188 | ||||
|
189 | self.end_time = convert_date(data.get('end_time'), False) | |||
|
190 | self.duration = 0 | |||
|
191 | ||||
|
192 | if self.start_time and self.end_time: | |||
|
193 | d = self.end_time - self.start_time | |||
|
194 | self.duration = d.total_seconds() | |||
|
195 | ||||
|
196 | # update tags with other vars | |||
|
197 | if self.username: | |||
|
198 | self.tags['user_name'] = self.username | |||
|
199 | self.tags['report_language'] = Language.key_from_value(self.language) | |||
|
200 | ||||
|
201 | def add_slow_calls(self, data, report_group): | |||
|
202 | slow_calls = [] | |||
|
203 | for call in data.get('slow_calls', []): | |||
|
204 | sc_inst = SlowCall() | |||
|
205 | sc_inst.set_data(call, resource_id=self.resource_id, | |||
|
206 | report_group=report_group) | |||
|
207 | slow_calls.append(sc_inst) | |||
|
208 | self.slow_calls.extend(slow_calls) | |||
|
209 | return slow_calls | |||
|
210 | ||||
|
211 | def get_dict(self, request, details=False, exclude_keys=None, | |||
|
212 | include_keys=None): | |||
|
213 | from appenlight.models.services.report_group import ReportGroupService | |||
|
214 | instance_dict = super(Report, self).get_dict() | |||
|
215 | instance_dict['req_stats'] = self.req_stats() | |||
|
216 | instance_dict['group'] = {} | |||
|
217 | instance_dict['group']['id'] = self.report_group.id | |||
|
218 | instance_dict['group'][ | |||
|
219 | 'total_reports'] = self.report_group.total_reports | |||
|
220 | instance_dict['group']['last_report'] = self.report_group.last_report | |||
|
221 | instance_dict['group']['priority'] = self.report_group.priority | |||
|
222 | instance_dict['group']['occurences'] = self.report_group.occurences | |||
|
223 | instance_dict['group'][ | |||
|
224 | 'last_timestamp'] = self.report_group.last_timestamp | |||
|
225 | instance_dict['group'][ | |||
|
226 | 'first_timestamp'] = self.report_group.first_timestamp | |||
|
227 | instance_dict['group']['public'] = self.report_group.public | |||
|
228 | instance_dict['group']['fixed'] = self.report_group.fixed | |||
|
229 | instance_dict['group']['read'] = self.report_group.read | |||
|
230 | instance_dict['group'][ | |||
|
231 | 'average_duration'] = self.report_group.average_duration | |||
|
232 | ||||
|
233 | instance_dict[ | |||
|
234 | 'resource_name'] = self.report_group.application.resource_name | |||
|
235 | instance_dict['report_type'] = self.report_type | |||
|
236 | ||||
|
237 | if instance_dict['http_status'] == 404 and not instance_dict['error']: | |||
|
238 | instance_dict['error'] = '404 Not Found' | |||
|
239 | ||||
|
240 | if details: | |||
|
241 | instance_dict['affected_users_count'] = \ | |||
|
242 | ReportGroupService.affected_users_count(self.report_group) | |||
|
243 | instance_dict['top_affected_users'] = [ | |||
|
244 | {'username': u.username, 'count': u.count} for u in | |||
|
245 | ReportGroupService.top_affected_users(self.report_group)] | |||
|
246 | instance_dict['application'] = {'integrations': []} | |||
|
247 | for integration in self.report_group.application.integrations: | |||
|
248 | if integration.front_visible: | |||
|
249 | instance_dict['application']['integrations'].append( | |||
|
250 | {'name': integration.integration_name, | |||
|
251 | 'action': integration.integration_action}) | |||
|
252 | instance_dict['comments'] = [c.get_dict() for c in | |||
|
253 | self.report_group.comments] | |||
|
254 | ||||
|
255 | instance_dict['group']['next_report'] = None | |||
|
256 | instance_dict['group']['previous_report'] = None | |||
|
257 | next_in_group = self.get_next_in_group() | |||
|
258 | previous_in_group = self.get_previous_in_group() | |||
|
259 | if next_in_group: | |||
|
260 | instance_dict['group']['next_report'] = next_in_group.id | |||
|
261 | if previous_in_group: | |||
|
262 | instance_dict['group']['previous_report'] = \ | |||
|
263 | previous_in_group.id | |||
|
264 | ||||
|
265 | # slow call ordering | |||
|
266 | def find_parent(row, data): | |||
|
267 | for r in reversed(data): | |||
|
268 | try: | |||
|
269 | if (row['timestamp'] > r['timestamp'] and | |||
|
270 | row['end_time'] < r['end_time']): | |||
|
271 | return r | |||
|
272 | except TypeError as e: | |||
|
273 | log.warning('reports_view.find_parent: %s' % e) | |||
|
274 | return None | |||
|
275 | ||||
|
276 | new_calls = [] | |||
|
277 | calls = [c.get_dict() for c in self.slow_calls] | |||
|
278 | while calls: | |||
|
279 | # start from end | |||
|
280 | for x in range(len(calls) - 1, -1, -1): | |||
|
281 | parent = find_parent(calls[x], calls) | |||
|
282 | if parent: | |||
|
283 | parent['children'].append(calls[x]) | |||
|
284 | else: | |||
|
285 | # no parent at all? append to new calls anyway | |||
|
286 | new_calls.append(calls[x]) | |||
|
287 | # print 'append', calls[x] | |||
|
288 | del calls[x] | |||
|
289 | break | |||
|
290 | instance_dict['slow_calls'] = new_calls | |||
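The nesting pass above can be exercised on plain dicts. In this sketch, `nest_calls` and the sample call names are illustrative stand-ins for the `slow_calls` dicts, assuming each carries comparable `timestamp`/`end_time` values and a `children` list, as the serialized slow calls do here:

```python
def find_parent(row, data):
    # scan from the end so the tightest enclosing call wins
    for r in reversed(data):
        if row['timestamp'] > r['timestamp'] and row['end_time'] < r['end_time']:
            return r
    return None

def nest_calls(calls):
    # repeatedly peel the last call off the list, attaching it to its
    # tightest enclosing call or promoting it to a root
    roots = []
    while calls:
        for x in range(len(calls) - 1, -1, -1):
            parent = find_parent(calls[x], calls)
            if parent:
                parent['children'].append(calls[x])
            else:
                roots.append(calls[x])
            del calls[x]
            break
    return roots

calls = [
    {'name': 'render', 'timestamp': 0, 'end_time': 10, 'children': []},
    {'name': 'sql_query', 'timestamp': 2, 'end_time': 8, 'children': []},
]
tree = nest_calls(calls)
```

Because the list is consumed from the end, an inner call is always attached before its parent is promoted, so a flat, time-ordered list comes out as a call tree.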
|
291 | ||||
|
292 | instance_dict['front_url'] = self.get_public_url(request) | |||
|
293 | ||||
|
294 | exclude_keys_list = exclude_keys or [] | |||
|
295 | include_keys_list = include_keys or [] | |||
|
296 | for k in list(instance_dict.keys()): | |||
|
297 | if k == 'group': | |||
|
298 | continue | |||
|
299 | if (k in exclude_keys_list or | |||
|
300 | (k not in include_keys_list and include_keys)): | |||
|
301 | del instance_dict[k] | |||
|
302 | return instance_dict | |||
|
303 | ||||
|
304 | def get_previous_in_group(self): | |||
|
305 | start_day = self.report_group_time.date().replace(day=1) | |||
|
306 | end_day = start_day.replace(year=start_day.year + start_day.month // 12, month=start_day.month % 12 + 1) | |||
|
307 | query = self.report_group.reports.filter(Report.id < self.id) | |||
|
308 | query = query.filter(Report.report_group_time.between( | |||
|
309 | start_day, end_day)) | |||
|
310 | return query.order_by(sa.desc(Report.id)).first() | |||
|
311 | ||||
|
312 | def get_next_in_group(self): | |||
|
313 | start_day = self.report_group_time.date().replace(day=1) | |||
|
314 | end_day = start_day.replace(year=start_day.year + start_day.month // 12, month=start_day.month % 12 + 1) | |||
|
315 | query = self.report_group.reports.filter(Report.id > self.id) | |||
|
316 | query = query.filter(Report.report_group_time.between( | |||
|
317 | start_day, end_day)) | |||
|
318 | return query.order_by(sa.asc(Report.id)).first() | |||
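The calendar-month window used by both navigation queries can be factored out; `month_window` is an illustrative helper, not a function from the source, written so the December case rolls over into January of the next year:

```python
from datetime import date

def month_window(day):
    # [start_day, end_day) spans the calendar month containing 'day',
    # mirroring the window get_previous_in_group / get_next_in_group
    # build; the modular arithmetic handles the December rollover
    start_day = day.replace(day=1)
    end_day = start_day.replace(year=start_day.year + start_day.month // 12,
                                month=start_day.month % 12 + 1)
    return start_day, end_day

assert month_window(date(2016, 12, 15)) == (date(2016, 12, 1), date(2017, 1, 1))
```

Restricting the query to one month keeps it inside a single partition, which matches the monthly `partition_id` scheme used for the Elasticsearch indices.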
|
319 | ||||
|
320 | def get_public_url(self, request=None, report_group=None, _app_url=None): | |||
|
321 | """ | |||
|
322 | Returns the URL that a user can use to visit a specific report | |||
|
323 | """ | |||
|
324 | if not request: | |||
|
325 | request = get_current_request() | |||
|
326 | url = request.route_url('/', _app_url=_app_url) | |||
|
327 | if report_group: | |||
|
328 | return (url + 'ui/report/%s/%s') % (report_group.id, self.id) | |||
|
329 | return (url + 'ui/report/%s/%s') % (self.group_id, self.id) | |||
|
330 | ||||
|
331 | def req_stats(self): | |||
|
332 | stats = self.request_stats.copy() | |||
|
333 | stats['percentages'] = {} | |||
|
334 | stats['percentages']['main'] = 100.0 | |||
|
335 | main = stats.get('main', 0.0) | |||
|
336 | if not main: | |||
|
337 | return None | |||
|
338 | for name, call_time in stats.items(): | |||
|
339 | if ('calls' not in name and 'main' not in name and | |||
|
340 | 'percentages' not in name): | |||
|
341 | stats['main'] -= call_time | |||
|
342 | stats['percentages'][name] = math.floor( | |||
|
343 | (call_time / main * 100.0)) | |||
|
344 | stats['percentages']['main'] -= stats['percentages'][name] | |||
|
345 | if stats['percentages']['main'] < 0.0: | |||
|
346 | stats['percentages']['main'] = 0.0 | |||
|
347 | stats['main'] = 0.0 | |||
|
348 | return stats | |||
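A standalone sketch of the percentage computation above, written as a plain function for illustration: `main` holds the total request time, the other keys hold per-subsystem times, and `*_calls` keys hold call counts that must be skipped.

```python
import math

def req_stats(request_stats):
    # sketch of Report.req_stats: express each subsystem's time as a
    # floored percentage of 'main', and leave 'main' holding whatever
    # time is not attributed to any subsystem
    stats = request_stats.copy()
    stats['percentages'] = {'main': 100.0}
    main = stats.get('main', 0.0)
    if not main:
        return None
    for name, call_time in list(stats.items()):
        if ('calls' not in name and 'main' not in name and
                'percentages' not in name):
            stats['main'] -= call_time
            stats['percentages'][name] = math.floor(call_time / main * 100.0)
            stats['percentages']['main'] -= stats['percentages'][name]
    if stats['percentages']['main'] < 0.0:
        stats['percentages']['main'] = 0.0
        stats['main'] = 0.0
    return stats

stats = req_stats({'main': 1.0, 'sql': 0.25, 'sql_calls': 4})
# 'sql' accounts for 25 percent; 'main' keeps the remaining 75
```

Flooring each subsystem percentage means the per-subsystem numbers can undershoot, which is why the leftover is clamped to zero rather than allowed to go negative.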
|
349 | ||||
|
350 | def generate_grouping_hash(self, hash_string=None, default_grouping=None, | |||
|
351 | protocol_version=None): | |||
|
352 | """ | |||
|
353 | Generates SHA1 hash that will be used to group reports together | |||
|
354 | """ | |||
|
355 | if not hash_string: | |||
|
356 | location = self.tags.get('view_name') or self.url_path | |||
|
357 | server_name = self.tags.get('server_name') or '' | |||
|
358 | if default_grouping == 'url_traceback': | |||
|
359 | hash_string = '%s_%s_%s' % (self.traceback_hash, location, | |||
|
360 | self.error) | |||
|
361 | if self.language == Language.javascript: | |||
|
362 | hash_string = '%s_%s' % (self.traceback_hash, self.error) | |||
|
363 | ||||
|
364 | elif default_grouping == 'traceback_server': | |||
|
365 | hash_string = '%s_%s' % (self.traceback_hash, server_name) | |||
|
366 | if self.language == Language.javascript: | |||
|
367 | hash_string = '%s_%s' % (self.traceback_hash, server_name) | |||
|
368 | else: | |||
|
369 | hash_string = '%s_%s' % (self.error, location) | |||
|
370 | binary_string = hash_string.encode('utf8') | |||
|
371 | self.grouping_hash = hashlib.sha1(binary_string).hexdigest() | |||
|
372 | return self.grouping_hash | |||
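The grouping-strategy branches above can be flattened into a pure function; the name and argument list here are illustrative (the non-JavaScript path only):

```python
import hashlib

def grouping_hash(traceback_hash, error, location, server_name,
                  default_grouping):
    # sketch of generate_grouping_hash: pick the hash input according to
    # the application's grouping strategy, then SHA1 it
    if default_grouping == 'url_traceback':
        hash_string = '%s_%s_%s' % (traceback_hash, location, error)
    elif default_grouping == 'traceback_server':
        hash_string = '%s_%s' % (traceback_hash, server_name)
    else:
        hash_string = '%s_%s' % (error, location)
    return hashlib.sha1(hash_string.encode('utf8')).hexdigest()

h = grouping_hash('abc123', 'KeyError', '/users', 'web01', 'url_traceback')
# identical inputs always produce the same 40-character hex digest,
# which is what lets a new report land in an existing group
```

The strategy choice trades precision for stability: hashing on server name groups the same traceback across URLs, while hashing on location splits it per view.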
|
373 | ||||
|
374 | def stripped_traceback(self): | |||
|
375 | """ | |||
|
376 | Traceback without local vars | |||
|
377 | """ | |||
|
378 | stripped_traceback = copy.deepcopy(self.traceback) | |||
|
379 | ||||
|
380 | if isinstance(stripped_traceback, list): | |||
|
381 | for row in stripped_traceback: | |||
|
382 | row.pop('vars', None) | |||
|
383 | return stripped_traceback | |||
|
384 | ||||
|
385 | def notify_channel(self, report_group): | |||
|
386 | """ | |||
|
387 | Sends notification to websocket channel | |||
|
388 | """ | |||
|
389 | settings = get_current_registry().settings | |||
|
390 | log.info('notify cometd') | |||
|
391 | if self.report_type != ReportType.error: | |||
|
392 | return | |||
|
393 | payload = { | |||
|
394 | 'type': 'message', | |||
|
395 | 'user': '__system__', | |||

396 | 'channel': 'app_%s' % self.resource_id, | |||
|
397 | 'message': { | |||
|
398 | 'type': 'report', | |||
|
399 | 'report': { | |||
|
400 | 'group': { | |||
|
401 | 'priority': report_group.priority, | |||
|
402 | 'first_timestamp': report_group.first_timestamp, | |||
|
403 | 'last_timestamp': report_group.last_timestamp, | |||
|
404 | 'average_duration': report_group.average_duration, | |||
|
405 | 'occurences': report_group.occurences | |||
|
406 | }, | |||
|
407 | 'report_id': self.id, | |||
|
408 | 'group_id': self.group_id, | |||
|
409 | 'resource_id': self.resource_id, | |||
|
410 | 'http_status': self.http_status, | |||
|
411 | 'url_domain': self.url_domain, | |||
|
412 | 'url_path': self.url_path, | |||
|
413 | 'error': self.error or '', | |||
|
414 | 'server': self.tags.get('server_name'), | |||
|
415 | 'view_name': self.tags.get('view_name'), | |||
|
416 | 'front_url': self.get_public_url(), | |||
|
417 | } | |||
|
418 | } | |||
|
419 | ||||
|
420 | } | |||
|
421 | ||||
|
422 | cometd_request(settings['cometd.secret'], '/message', [payload], | |||
|
423 | servers=[settings['cometd_servers']]) | |||
|
424 | ||||
|
425 | def es_doc(self): | |||
|
426 | tags = {} | |||
|
427 | tag_list = [] | |||
|
428 | for name, value in self.tags.items(): | |||
|
429 | name = name.replace('.', '_') | |||
|
430 | tag_list.append(name) | |||
|
431 | tags[name] = { | |||
|
432 | "values": convert_es_type(value), | |||
|
433 | "numeric_values": value if ( | |||
|
434 | isinstance(value, (int, float)) and | |||
|
435 | not isinstance(value, bool)) else None} | |||
|
436 | ||||
|
437 | if 'user_name' not in self.tags and self.username: | |||
|
438 | tags["user_name"] = {"values": [self.username], | |||

439 | "numeric_values": None} | |||
|
440 | return { | |||
|
441 | '_id': str(self.id), | |||
|
442 | 'pg_id': str(self.id), | |||
|
443 | 'resource_id': self.resource_id, | |||
|
444 | 'http_status': self.http_status or '', | |||
|
445 | 'start_time': self.start_time, | |||
|
446 | 'end_time': self.end_time, | |||
|
447 | 'url_domain': self.url_domain if self.url_domain else '', | |||
|
448 | 'url_path': self.url_path if self.url_path else '', | |||
|
449 | 'duration': self.duration, | |||
|
450 | 'error': self.error if self.error else '', | |||
|
451 | 'report_type': self.report_type, | |||
|
452 | 'request_id': self.request_id, | |||
|
453 | 'ip': self.ip, | |||
|
454 | 'group_id': str(self.group_id), | |||
|
455 | '_parent': str(self.group_id), | |||
|
456 | 'tags': tags, | |||
|
457 | 'tag_list': tag_list | |||
|
458 | } | |||
|
459 | ||||
|
460 | @property | |||
|
461 | def partition_id(self): | |||
|
462 | return 'rcae_r_%s' % self.report_group_time.strftime('%Y_%m') | |||
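The partition naming above is just a date format; `report_partition` is an illustrative standalone version of the property:

```python
from datetime import datetime

def report_partition(report_group_time):
    # monthly Elasticsearch partition name as built by Report.partition_id
    return 'rcae_r_%s' % report_group_time.strftime('%Y_%m')

idx = report_partition(datetime(2016, 3, 7))
```

Keeping one index per month makes retention cheap: old data is dropped by deleting whole indices instead of running per-document deletes.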
|
463 | ||||
|
464 | ||||
|
465 | def after_insert(mapper, connection, target): | |||
|
466 | if not hasattr(target, '_skip_ft_index'): | |||
|
467 | data = target.es_doc() | |||
|
468 | data.pop('_id', None) | |||
|
469 | Datastores.es.index(target.partition_id, 'report', data, | |||
|
470 | parent=target.group_id, id=target.id) | |||
|
471 | ||||
|
472 | ||||
|
473 | def after_update(mapper, connection, target): | |||
|
474 | if not hasattr(target, '_skip_ft_index'): | |||
|
475 | data = target.es_doc() | |||
|
476 | data.pop('_id', None) | |||
|
477 | Datastores.es.index(target.partition_id, 'report', data, | |||
|
478 | parent=target.group_id, id=target.id) | |||
|
479 | ||||
|
480 | ||||
|
481 | def after_delete(mapper, connection, target): | |||
|
482 | if not hasattr(target, '_skip_ft_index'): | |||
|
483 | query = {'term': {'pg_id': target.id}} | |||
|
484 | Datastores.es.delete_by_query(target.partition_id, 'report', query) | |||
|
485 | ||||
|
486 | ||||
|
487 | sa.event.listen(Report, 'after_insert', after_insert) | |||
|
488 | sa.event.listen(Report, 'after_update', after_update) | |||
|
489 | sa.event.listen(Report, 'after_delete', after_delete) |
@@ -0,0 +1,37 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from ziggurat_foundations.models.base import BaseModel | |||
|
23 | from appenlight.models import Base | |||
|
24 | import sqlalchemy as sa | |||
|
25 | ||||
|
26 | ||||
|
27 | class ReportAssignment(Base, BaseModel): | |||
|
28 | __tablename__ = 'reports_assignments' | |||
|
29 | ||||
|
30 | group_id = sa.Column(sa.BigInteger, | |||
|
31 | sa.ForeignKey('reports_groups.id', ondelete='cascade', | |||
|
32 | onupdate='cascade'), | |||
|
33 | primary_key=True) | |||
|
34 | owner_id = sa.Column(sa.Integer, | |||
|
35 | sa.ForeignKey('users.id', onupdate='CASCADE', | |||
|
36 | ondelete='CASCADE'), primary_key=True) | |||
|
37 | report_time = sa.Column(sa.DateTime(), nullable=False) |
@@ -0,0 +1,55 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | ||||
|
24 | from datetime import datetime | |||
|
25 | from appenlight.models import Base | |||
|
26 | from ziggurat_foundations.models.base import BaseModel | |||
|
27 | ||||
|
28 | ||||
|
29 | class ReportComment(Base, BaseModel): | |||
|
30 | __tablename__ = 'reports_comments' | |||
|
31 | ||||
|
32 | comment_id = sa.Column(sa.Integer, nullable=False, primary_key=True) | |||
|
33 | group_id = sa.Column(sa.BigInteger, | |||
|
34 | sa.ForeignKey('reports_groups.id', ondelete='cascade', | |||
|
35 | onupdate='cascade')) | |||
|
36 | body = sa.Column(sa.UnicodeText(), default='') | |||
|
37 | owner_id = sa.Column(sa.Integer, | |||
|
38 | sa.ForeignKey('users.id', onupdate='CASCADE', | |||
|
39 | ondelete='CASCADE')) | |||
|
40 | created_timestamp = sa.Column(sa.DateTime(), | |||
|
41 | default=datetime.utcnow, | |||
|
42 | server_default=sa.func.now()) | |||
|
43 | report_time = sa.Column(sa.DateTime(), nullable=False) | |||
|
44 | ||||
|
45 | owner = sa.orm.relationship('User', | |||
|
46 | lazy='joined') | |||
|
47 | ||||
|
48 | @property | |||
|
49 | def processed_body(self): | |||
|
50 | return self.body | |||
|
51 | ||||
|
52 | def get_dict(self): | |||
|
53 | instance_dict = super(ReportComment, self).get_dict() | |||
|
54 | instance_dict['user_name'] = self.owner.user_name | |||
|
55 | return instance_dict |
@@ -0,0 +1,251 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | import sqlalchemy as sa | |||
|
24 | ||||
|
25 | from datetime import datetime | |||
|
26 | ||||
|
27 | from pyramid.threadlocal import get_current_request | |||
|
28 | from sqlalchemy.dialects.postgresql import JSON | |||
|
29 | from ziggurat_foundations.models.base import BaseModel | |||
|
30 | ||||
|
31 | from appenlight.models import Base, get_db_session, Datastores | |||
|
32 | from appenlight.lib.enums import ReportType | |||
|
33 | from appenlight.lib.rule import Rule | |||
|
34 | from appenlight.lib.redis_keys import REDIS_KEYS | |||
|
35 | from appenlight.models.report import REPORT_TYPE_MATRIX | |||
|
36 | ||||
|
37 | log = logging.getLogger(__name__) | |||
|
38 | ||||
|
39 | ||||
|
40 | class ReportGroup(Base, BaseModel): | |||
|
41 | __tablename__ = 'reports_groups' | |||
|
42 | __table_args__ = {'implicit_returning': False} | |||
|
43 | ||||
|
44 | id = sa.Column(sa.BigInteger(), nullable=False, primary_key=True) | |||
|
45 | resource_id = sa.Column(sa.Integer(), | |||
|
46 | sa.ForeignKey('applications.resource_id', | |||
|
47 | onupdate='CASCADE', | |||
|
48 | ondelete='CASCADE'), | |||
|
49 | nullable=False, | |||
|
50 | index=True) | |||
|
51 | priority = sa.Column(sa.Integer, nullable=False, index=True, default=5, | |||
|
52 | server_default='5') | |||
|
53 | first_timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
54 | server_default=sa.func.now()) | |||
|
55 | last_timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
56 | server_default=sa.func.now()) | |||
|
57 | error = sa.Column(sa.UnicodeText(), index=True) | |||
|
58 | grouping_hash = sa.Column(sa.String(40), default='') | |||
|
59 | triggered_postprocesses_ids = sa.Column(JSON(), nullable=False, | |||
|
60 | default=list) | |||
|
61 | report_type = sa.Column(sa.Integer, default=1) | |||
|
62 | total_reports = sa.Column(sa.Integer, default=1) | |||
|
63 | last_report = sa.Column(sa.Integer) | |||
|
64 | occurences = sa.Column(sa.Integer, default=1) | |||
|
65 | average_duration = sa.Column(sa.Float, default=0) | |||
|
66 | summed_duration = sa.Column(sa.Float, default=0) | |||
|
67 | read = sa.Column(sa.Boolean(), index=True, default=False) | |||
|
68 | fixed = sa.Column(sa.Boolean(), index=True, default=False) | |||
|
69 | notified = sa.Column(sa.Boolean(), index=True, default=False) | |||
|
70 | public = sa.Column(sa.Boolean(), index=True, default=False) | |||
|
71 | ||||
|
72 | reports = sa.orm.relationship('Report', | |||
|
73 | lazy='dynamic', | |||
|
74 | backref='report_group', | |||
|
75 | cascade="all, delete-orphan", | |||
|
76 | passive_deletes=True, | |||
|
77 | passive_updates=True, ) | |||
|
78 | ||||
|
79 | comments = sa.orm.relationship('ReportComment', | |||
|
80 | lazy='dynamic', | |||
|
81 | backref='report', | |||
|
82 | cascade="all, delete-orphan", | |||
|
83 | passive_deletes=True, | |||
|
84 | passive_updates=True, | |||
|
85 | order_by="ReportComment.comment_id") | |||
|
86 | ||||
|
87 | assigned_users = sa.orm.relationship('User', | |||
|
88 | backref=sa.orm.backref( | |||
|
89 | 'assigned_reports_relation', | |||
|
90 | lazy='dynamic', | |||
|
91 | order_by=sa.desc( | |||
|
92 | "reports_groups.id") | |||
|
93 | ), | |||
|
94 | passive_deletes=True, | |||
|
95 | passive_updates=True, | |||
|
96 | secondary='reports_assignments', | |||
|
97 | order_by="User.user_name") | |||
|
98 | ||||
|
99 | stats = sa.orm.relationship('ReportStat', | |||
|
100 | lazy='dynamic', | |||
|
101 | backref='report', | |||
|
102 | passive_deletes=True, | |||
|
103 | passive_updates=True, ) | |||
|
104 | ||||
|
105 | last_report_ref = sa.orm.relationship('Report', | |||
|
106 | uselist=False, | |||
|
107 | primaryjoin="ReportGroup.last_report " | |||
|
108 | "== Report.id", | |||
|
109 | foreign_keys="Report.id", | |||
|
110 | cascade="all, delete-orphan", | |||
|
111 | passive_deletes=True, | |||
|
112 | passive_updates=True, ) | |||
|
113 | ||||
|
114 | def __repr__(self): | |||
|
115 | return '<ReportGroup id:{}>'.format(self.id) | |||
|
116 | ||||
|
117 | def get_report(self, report_id=None, public=False): | |||
|
118 | """ | |||
|
119 | Gets the report with a specific id, or the latest report if no id was specified | |||
|
120 | """ | |||
|
121 | from .report import Report | |||
|
122 | ||||
|
123 | if not report_id: | |||
|
124 | return self.last_report_ref | |||
|
125 | else: | |||
|
126 | return self.reports.filter(Report.id == report_id).first() | |||
|
127 | ||||
|
128 | def get_public_url(self, request, _app_url=None): | |||
|
129 | url = request.route_url('/', _app_url=_app_url) | |||
|
130 | return (url + 'ui/report/%s') % self.id | |||
|
131 | ||||
|
132 | def run_postprocessing(self, report): | |||
|
133 | """ | |||
|
134 | Alters report group priority based on postprocessing configuration | |||
|
135 | """ | |||
|
136 | request = get_current_request() | |||
|
137 | get_db_session(None, self).flush() | |||
|
138 | for action in self.application.postprocess_conf: | |||
|
139 | get_db_session(None, self).flush() | |||
|
140 | rule_obj = Rule(action.rule, REPORT_TYPE_MATRIX) | |||
|
141 | report_dict = report.get_dict(request) | |||
|
142 | # if was not processed yet | |||
|
143 | if (rule_obj.match(report_dict) and | |||
|
144 | action.pkey not in self.triggered_postprocesses_ids): | |||
|
145 | action.postprocess(self) | |||
|
146 | # this way sqla can track mutation of list | |||
|
147 | self.triggered_postprocesses_ids = \ | |||
|
148 | self.triggered_postprocesses_ids + [action.pkey] | |||
|
149 | ||||
|
150 | get_db_session(None, self).flush() | |||
|
151 | # do not go out of bounds | |||
|
152 | if self.priority < 1: | |||
|
153 | self.priority = 1 | |||
|
154 | if self.priority > 10: | |||
|
155 | self.priority = 10 | |||
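The loop above fires each matching postprocess action at most once per group and rebuilds the id list instead of mutating it. A minimal sketch of that guard, where the dict-based `actions` shape is illustrative rather than the real `postprocess_conf` objects:

```python
def run_postprocess_actions(report_dict, actions, triggered_ids):
    # each matching action fires at most once per group; the id list is
    # rebuilt (not appended to in place) so an ORM tracking the JSON
    # column sees the reassignment as a change
    for action in actions:
        if action['matches'](report_dict) and action['pkey'] not in triggered_ids:
            action['run']()
            triggered_ids = triggered_ids + [action['pkey']]
    return triggered_ids

fired = []
actions = [{'pkey': 1,
            'matches': lambda d: d['http_status'] == 500,
            'run': lambda: fired.append(1)}]
ids = run_postprocess_actions({'http_status': 500}, actions, [])
ids = run_postprocess_actions({'http_status': 500}, actions, ids)
# the action ran once; the second pass is a no-op
```

Reassigning the list is the key detail: in-place mutation of a plain JSON column value is invisible to SQLAlchemy's unit of work, which is why the source builds a new list on line 147-148.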
|
156 | ||||
|
157 | def get_dict(self, request): | |||
|
158 | instance_dict = super(ReportGroup, self).get_dict() | |||
|
159 | instance_dict['server_name'] = self.get_report().tags.get( | |||
|
160 | 'server_name') | |||
|
161 | instance_dict['view_name'] = self.get_report().tags.get('view_name') | |||
|
162 | instance_dict['resource_name'] = self.application.resource_name | |||
|
163 | instance_dict['report_type'] = self.get_report().report_type | |||
|
164 | instance_dict['url_path'] = self.get_report().url_path | |||
|
165 | instance_dict['front_url'] = self.get_report().get_public_url(request) | |||
|
166 | del instance_dict['triggered_postprocesses_ids'] | |||
|
167 | return instance_dict | |||
|
168 | ||||
|
169 | def es_doc(self): | |||
|
170 | return { | |||
|
171 | '_id': str(self.id), | |||
|
172 | 'pg_id': str(self.id), | |||
|
173 | 'resource_id': self.resource_id, | |||
|
174 | 'error': self.error, | |||
|
175 | 'fixed': self.fixed, | |||
|
176 | 'public': self.public, | |||
|
177 | 'read': self.read, | |||
|
178 | 'priority': self.priority, | |||
|
179 | 'occurences': self.occurences, | |||
|
180 | 'average_duration': self.average_duration, | |||
|
181 | 'summed_duration': self.summed_duration, | |||
|
182 | 'first_timestamp': self.first_timestamp, | |||
|
183 | 'last_timestamp': self.last_timestamp | |||
|
184 | } | |||
|
185 | ||||
|
186 | def set_notification_info(self, notify_10=False, notify_100=False): | |||
|
187 | """ | |||
|
188 | Update redis notification maps for notification job | |||
|
189 | """ | |||
|
190 | current_time = datetime.utcnow().replace(second=0, microsecond=0) | |||
|
191 | # global app counter | |||
|
192 | key = REDIS_KEYS['counters']['reports_per_type'].format( | |||
|
193 | self.report_type, current_time) | |||
|
194 | Datastores.redis.incr(key) | |||
|
195 | Datastores.redis.expire(key, 3600 * 24) | |||
|
196 | # detailed app notification | |||
|
197 | Datastores.redis.sadd(REDIS_KEYS['apps_that_had_reports'], | |||
|
198 | self.resource_id) | |||
|
199 | # only notify for exceptions here | |||
|
200 | if self.report_type == ReportType.error: | |||
|
201 | Datastores.redis.sadd(REDIS_KEYS['apps_that_had_reports'], | |||
|
202 | self.resource_id) | |||
|
203 | key = REDIS_KEYS['counters']['report_group_occurences'].format(self.id) | |||
|
204 | Datastores.redis.incr(key) | |||
|
205 | Datastores.redis.expire(key, 3600 * 24) | |||
|
206 | ||||
|
207 | if notify_10: | |||
|
208 | key = REDIS_KEYS['counters'][ | |||
|
209 | 'report_group_occurences_10th'].format(self.id) | |||
|
210 | Datastores.redis.setex(key, 3600 * 24, 1) | |||
|
211 | if notify_100: | |||
|
212 | key = REDIS_KEYS['counters'][ | |||
|
213 | 'report_group_occurences_100th'].format(self.id) | |||
|
214 | Datastores.redis.setex(key, 3600 * 24, 1) | |||
|
215 | ||||
|
216 | key = REDIS_KEYS['reports_to_notify_per_type_per_app'].format( | |||
|
217 | self.report_type, self.resource_id) | |||
|
218 | Datastores.redis.sadd(key, self.id) | |||
|
219 | Datastores.redis.expire(key, 3600 * 24) | |||
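The notification bookkeeping above hinges on how the Redis keys are bucketed. The templates below are hypothetical stand-ins (the real ones live in `appenlight.lib.redis_keys`), shown only to illustrate the per-minute bucketing:

```python
from datetime import datetime

# hypothetical key templates mirroring how set_notification_info uses
# REDIS_KEYS; the real templates are defined elsewhere
REDIS_KEYS = {
    'counters': {
        'reports_per_type': 'reports_per_type:{}:{}',
        'report_group_occurences': 'report_group_occurences:{}',
    },
}

# truncating seconds means every report arriving in the same minute
# INCRs the same key, and a 24h EXPIRE keeps the notification job's
# working set bounded
current_time = datetime(2016, 5, 1, 12, 30, 45).replace(second=0,
                                                        microsecond=0)
key = REDIS_KEYS['counters']['reports_per_type'].format(1, current_time)
```

The `notify_10`/`notify_100` keys work the same way but use `SETEX` as a one-shot flag, so the milestone notification fires once and then silently expires.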
|
220 | ||||
|
221 | @property | |||
|
222 | def partition_id(self): | |||
|
223 | return 'rcae_r_%s' % self.first_timestamp.strftime('%Y_%m') | |||
|
224 | ||||
|
225 | ||||
|
226 | def after_insert(mapper, connection, target): | |||
|
227 | if not hasattr(target, '_skip_ft_index'): | |||
|
228 | data = target.es_doc() | |||
|
229 | data.pop('_id', None) | |||
|
230 | Datastores.es.index(target.partition_id, 'report_group', | |||
|
231 | data, id=target.id) | |||
|
232 | ||||
|
233 | ||||
|
234 | def after_update(mapper, connection, target): | |||
|
235 | if not hasattr(target, '_skip_ft_index'): | |||
|
236 | data = target.es_doc() | |||
|
237 | data.pop('_id', None) | |||
|
238 | Datastores.es.index(target.partition_id, 'report_group', | |||
|
239 | data, id=target.id) | |||
|
240 | ||||
|
241 | ||||
|
242 | def after_delete(mapper, connection, target): | |||
|
243 | query = {'term': {'group_id': target.id}} | |||
|
244 | Datastores.es.delete_by_query(target.partition_id, 'report', query) | |||
|
245 | query = {'term': {'pg_id': target.id}} | |||
|
246 | Datastores.es.delete_by_query(target.partition_id, 'report_group', query) | |||
|
247 | ||||
|
248 | ||||
|
249 | sa.event.listen(ReportGroup, 'after_insert', after_insert) | |||
|
250 | sa.event.listen(ReportGroup, 'after_update', after_update) | |||
|
251 | sa.event.listen(ReportGroup, 'after_delete', after_delete) |
@@ -0,0 +1,79 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | ||||
|
24 | from appenlight.lib.enums import ReportType | |||
|
25 | from appenlight.models import Base | |||
|
26 | from ziggurat_foundations.models.base import BaseModel | |||
|
27 | ||||
|
28 | ||||
|
29 | class ReportStat(Base, BaseModel): | |||
|
30 | __tablename__ = 'reports_stats' | |||
|
31 | __table_args__ = {'implicit_returning': False} | |||
|
32 | ||||
|
33 | group_id = sa.Column(sa.BigInteger(), | |||
|
34 | sa.ForeignKey('reports_groups.id'), | |||
|
35 | nullable=False) | |||
|
36 | resource_id = sa.Column(sa.Integer(), | |||
|
37 | sa.ForeignKey('applications.resource_id'), | |||
|
38 | nullable=False) | |||
|
39 | start_interval = sa.Column(sa.DateTime(), nullable=False) | |||
|
40 | occurences = sa.Column(sa.Integer, nullable=True, default=0) | |||
|
41 | owner_user_id = sa.Column(sa.Integer(), sa.ForeignKey('users.id'), | |||
|
42 | nullable=True) | |||
|
43 | type = sa.Column(sa.Integer, nullable=True, default=0) | |||
|
44 | duration = sa.Column(sa.Float, nullable=True, default=0) | |||
|
45 | id = sa.Column(sa.BigInteger, nullable=False, primary_key=True) | |||
|
46 | server_name = sa.Column(sa.Unicode(128), nullable=False, default='') | |||
|
47 | view_name = sa.Column(sa.Unicode(128), nullable=False, default='') | |||
|
48 | ||||
|
49 | @property | |||
|
50 | def partition_id(self): | |||
|
51 | return 'rcae_r_%s' % self.start_interval.strftime('%Y_%m') | |||
|
52 | ||||
|
53 | def es_doc(self): | |||
|
54 | return { | |||
|
55 | 'resource_id': self.resource_id, | |||
|
56 | 'timestamp': self.start_interval, | |||
|
57 | 'pg_id': str(self.id), | |||
|
58 | 'permanent': True, | |||
|
59 | 'request_id': None, | |||
|
60 | 'log_level': 'ERROR', | |||
|
61 | 'message': None, | |||
|
62 | 'namespace': 'appenlight.error', | |||
|
63 | 'tags': { | |||
|
64 | 'duration': {'values': self.duration, | |||
|
65 | 'numeric_values': self.duration}, | |||
|
66 | 'occurences': {'values': self.occurences, | |||
|
67 | 'numeric_values': self.occurences}, | |||
|
68 | 'group_id': {'values': self.group_id, | |||
|
69 | 'numeric_values': self.group_id}, | |||
|
70 | 'type': {'values': ReportType.key_from_value(self.type), | |||
|
71 | 'numeric_values': self.type}, | |||
|
72 | 'server_name': {'values': self.server_name, | |||
|
73 | 'numeric_values': None}, | |||
|
74 | 'view_name': {'values': self.view_name, | |||
|
75 | 'numeric_values': None}, | |||
|
76 | }, | |||
|
77 | 'tag_list': ['duration', 'occurences', 'group_id', 'type', | |||
|
78 | 'server_name', 'view_name'] | |||
|
79 | } |
@@ -0,0 +1,69 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from datetime import datetime | |||
|
23 | ||||
|
24 | import sqlalchemy as sa | |||
|
25 | from sqlalchemy.dialects.postgresql import JSON | |||
|
26 | ||||
|
27 | from ziggurat_foundations.models.base import BaseModel | |||
|
28 | from appenlight.lib.utils import convert_es_type | |||
|
29 | from appenlight.models import Base | |||
|
30 | ||||
|
31 | ||||
|
32 | class Metric(Base, BaseModel): | |||
|
33 | __tablename__ = 'metrics' | |||
|
34 | __table_args__ = {'implicit_returning': False} | |||
|
35 | ||||
|
36 | pkey = sa.Column(sa.BigInteger(), primary_key=True) | |||
|
37 | resource_id = sa.Column(sa.Integer(), | |||
|
38 | sa.ForeignKey('applications.resource_id'), | |||
|
39 | nullable=False, primary_key=True) | |||
|
40 | timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
41 | server_default=sa.func.now()) | |||
|
42 | tags = sa.Column(JSON(), default=dict) | |||
|
43 | namespace = sa.Column(sa.Unicode(255)) | |||
|
44 | ||||
|
45 | @property | |||
|
46 | def partition_id(self): | |||
|
47 | return 'rcae_m_%s' % self.timestamp.strftime('%Y_%m_%d') | |||
|
48 | ||||
|
49 | def es_doc(self): | |||
|
50 | tags = {} | |||
|
51 | tag_list = [] | |||
|
52 | for name, value in self.tags.items(): | |||
|
53 | # replace dot in indexed tag name | |||
|
54 | name = name.replace('.', '_') | |||
|
55 | tag_list.append(name) | |||
|
56 | tags[name] = { | |||
|
57 | "values": convert_es_type(value), | |||
|
58 | "numeric_values": value if ( | |||
|
59 | isinstance(value, (int, float)) and | |||
|
60 | not isinstance(value, bool)) else None | |||
|
61 | } | |||
|
62 | ||||
|
63 | return { | |||
|
64 | 'resource_id': self.resource_id, | |||
|
65 | 'timestamp': self.timestamp, | |||
|
66 | 'namespace': self.namespace, | |||
|
67 | 'tags': tags, | |||
|
68 | 'tag_list': tag_list | |||
|
69 | } |
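The tag handling inside `es_doc()` above can be sketched standalone. This is an illustrative sketch, not a function from the codebase, and `convert_es_type` is replaced with a pass-through (a simplifying assumption); only the dot replacement and the numeric/boolean split are reproduced:

```python
def flatten_tags(tags):
    # Mirrors Metric.es_doc(): dots in tag names are replaced with
    # underscores so Elasticsearch does not interpret them as object
    # paths; numeric values are additionally exposed under
    # "numeric_values", but booleans are excluded even though bool is
    # a subclass of int in Python.
    out, tag_list = {}, []
    for name, value in tags.items():
        name = name.replace('.', '_')
        tag_list.append(name)
        numeric = (isinstance(value, (int, float))
                   and not isinstance(value, bool))
        out[name] = {
            "values": value,  # the real method routes this through convert_es_type
            "numeric_values": value if numeric else None,
        }
    return out, tag_list
```

For example, `flatten_tags({'http.status': 200, 'cached': True})` indexes the first tag as `http_status` with a numeric value, while the boolean tag gets `numeric_values` of `None`.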
@@ -0,0 +1,88 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | from appenlight.models import Base | |||
|
24 | from appenlight.lib.utils import permission_tuple_to_dict | |||
|
25 | from pyramid.security import Allow, ALL_PERMISSIONS | |||
|
26 | from ziggurat_foundations.models.resource import ResourceMixin | |||
|
27 | ||||
|
28 | ||||
|
29 | class Resource(ResourceMixin, Base): | |||
|
30 | events = sa.orm.relationship('Event', | |||
|
31 | lazy='dynamic', | |||
|
32 | backref='resource', | |||
|
33 | passive_deletes=True, | |||
|
34 | passive_updates=True) | |||
|
35 | ||||
|
36 | @property | |||
|
37 | def owner_user_name(self): | |||
|
38 | if self.owner: | |||
|
39 | return self.owner.user_name | |||
|
40 | ||||
|
41 | @property | |||
|
42 | def owner_group_name(self): | |||
|
43 | if self.owner_group: | |||
|
44 | return self.owner_group.group_name | |||
|
45 | ||||
|
46 | def get_dict(self, exclude_keys=None, include_keys=None, | |||
|
47 | include_perms=False, include_processing_rules=False): | |||
|
48 | result = super(Resource, self).get_dict(exclude_keys, include_keys) | |||
|
49 | result['possible_permissions'] = self.__possible_permissions__ | |||
|
50 | if include_perms: | |||
|
51 | result['current_permissions'] = self.user_permissions_list | |||
|
52 | else: | |||
|
53 | result['current_permissions'] = [] | |||
|
54 | if include_processing_rules: | |||
|
55 | result["postprocessing_rules"] = [rule.get_dict() for rule | |||
|
56 | in self.postprocess_conf] | |||
|
57 | else: | |||
|
58 | result["postprocessing_rules"] = [] | |||
|
59 | exclude_keys_list = exclude_keys or [] | |||
|
60 | include_keys_list = include_keys or [] | |||
|
61 | d = {} | |||
|
62 | for k in result.keys(): | |||
|
63 | if (k not in exclude_keys_list and | |||
|
64 | (k in include_keys_list or not include_keys)): | |||
|
65 | d[k] = result[k] | |||
|
66 | for k in ['owner_user_name', 'owner_group_name']: | |||
|
67 | if (k not in exclude_keys_list and | |||
|
68 | (k in include_keys_list or not include_keys)): | |||
|
69 | d[k] = getattr(self, k) | |||
|
70 | return d | |||
|
71 | ||||
|
72 | @property | |||
|
73 | def user_permissions_list(self): | |||
|
74 | return [permission_tuple_to_dict(perm) for perm in | |||
|
75 | self.users_for_perm('__any_permission__', | |||
|
76 | limit_group_permissions=True)] | |||
|
77 | ||||
|
78 | @property | |||
|
79 | def __acl__(self): | |||
|
80 | acls = [] | |||
|
81 | ||||
|
82 | if self.owner_user_id: | |||
|
83 | acls.extend([(Allow, self.owner_user_id, ALL_PERMISSIONS,), ]) | |||
|
84 | ||||
|
85 | if self.owner_group_id: | |||
|
86 | acls.extend([(Allow, "group:%s" % self.owner_group_id, | |||
|
87 | ALL_PERMISSIONS,), ]) | |||
|
88 | return acls |
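The include/exclude key selection that `Resource.get_dict()` above applies twice (once to the serialized result, once to the two `owner_*` properties) can be factored as a small helper. `filter_keys` is a hypothetical name for illustration, not a helper that exists in the codebase:

```python
def filter_keys(data, exclude_keys=None, include_keys=None):
    # A key survives when it is not in the exclude list AND either no
    # include list was passed at all, or the include list names it --
    # the same condition Resource.get_dict() evaluates. Note the test
    # is on the original include_keys argument, so an explicitly empty
    # include list behaves like "no filter".
    exclude = exclude_keys or []
    include = include_keys or []
    return {k: v for k, v in data.items()
            if k not in exclude and (k in include or not include_keys)}
```

This keeps the exclude list authoritative: a key listed in both `include_keys` and `exclude_keys` is dropped.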
@@ -0,0 +1,21 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 |
@@ -0,0 +1,40 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from appenlight.models import get_db_session | |||
|
23 | from appenlight.models.alert_channel import AlertChannel | |||
|
24 | from appenlight.models.services.base import BaseService | |||
|
25 | ||||
|
26 | ||||
|
27 | class AlertChannelService(BaseService): | |||
|
28 | @classmethod | |||
|
29 | def by_owner_id_and_pkey(cls, owner_id, pkey, db_session=None): | |||
|
30 | db_session = get_db_session(db_session) | |||
|
31 | query = db_session.query(AlertChannel) | |||
|
32 | query = query.filter(AlertChannel.owner_id == owner_id) | |||
|
33 | return query.filter(AlertChannel.pkey == pkey).first() | |||
|
34 | ||||
|
35 | @classmethod | |||
|
36 | def by_integration_id(cls, integration_id, db_session=None): | |||
|
37 | db_session = get_db_session(db_session) | |||
|
38 | query = db_session.query(AlertChannel) | |||
|
39 | query = query.filter(AlertChannel.integration_id == integration_id) | |||
|
40 | return query.first() |
@@ -0,0 +1,64 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from appenlight.models import get_db_session | |||
|
23 | from appenlight.models.alert_channel_action import AlertChannelAction | |||
|
24 | from appenlight.models.services.base import BaseService | |||
|
25 | ||||
|
26 | ||||
|
27 | class AlertChannelActionService(BaseService): | |||
|
28 | @classmethod | |||
|
29 | def by_owner_id_and_pkey(cls, owner_id, pkey, db_session=None): | |||
|
30 | db_session = get_db_session(db_session) | |||
|
31 | query = db_session.query(AlertChannelAction) | |||
|
32 | query = query.filter(AlertChannelAction.owner_id == owner_id) | |||
|
33 | return query.filter(AlertChannelAction.pkey == pkey).first() | |||
|
34 | ||||
|
35 | @classmethod | |||
|
36 | def by_pkey(cls, pkey, db_session=None): | |||
|
37 | db_session = get_db_session(db_session) | |||
|
38 | query = db_session.query(AlertChannelAction) | |||
|
39 | return query.filter(AlertChannelAction.pkey == pkey).first() | |||
|
40 | ||||
|
41 | @classmethod | |||
|
42 | def by_owner_id_and_type(cls, owner_id, alert_type, db_session=None): | |||
|
43 | db_session = get_db_session(db_session) | |||
|
44 | query = db_session.query(AlertChannelAction) | |||
|
45 | query = query.filter(AlertChannelAction.owner_id == owner_id) | |||
|
46 | return query.filter(AlertChannelAction.type == alert_type).first() | |||
|
47 | ||||
|
48 | @classmethod | |||
|
49 | def by_type(cls, alert_type, db_session=None): | |||
|
50 | db_session = get_db_session(db_session) | |||
|
51 | query = db_session.query(AlertChannelAction) | |||
|
52 | return query.filter(AlertChannelAction.type == alert_type) | |||
|
53 | ||||
|
54 | @classmethod | |||
|
55 | def by_other_id(cls, other_id, db_session=None): | |||
|
56 | db_session = get_db_session(db_session) | |||
|
57 | query = db_session.query(AlertChannelAction) | |||
|
58 | return query.filter(AlertChannelAction.other_id == other_id) | |||
|
59 | ||||
|
60 | @classmethod | |||
|
61 | def by_resource_id(cls, resource_id, db_session=None): | |||
|
62 | db_session = get_db_session(db_session) | |||
|
63 | query = db_session.query(AlertChannelAction) | |||
|
64 | return query.filter(AlertChannelAction.resource_id == resource_id) |
@@ -0,0 +1,193 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | import urllib.parse | |||
|
24 | ||||
|
25 | import sqlalchemy as sa | |||
|
26 | from pyramid.threadlocal import get_current_request | |||
|
27 | ||||
|
28 | from appenlight.lib.cache_regions import get_region | |||
|
29 | from appenlight.lib.enums import ReportType | |||
|
30 | from appenlight.models import get_db_session | |||
|
31 | from appenlight.models.report_group import ReportGroup | |||
|
32 | from appenlight.models.application import Application | |||
|
33 | from appenlight.models.event import Event | |||
|
34 | from appenlight.models.services.base import BaseService | |||
|
35 | from appenlight.models.services.event import EventService | |||
|
36 | ||||
|
37 | log = logging.getLogger(__name__) | |||
|
38 | ||||
|
39 | ||||
|
40 | # cache_memory_min_1 = get_region('memory_min_1') | |||
|
41 | ||||
|
42 | class ApplicationService(BaseService): | |||
|
43 | @classmethod | |||
|
44 | def by_api_key(cls, api_key, db_session=None): | |||
|
45 | db_session = get_db_session(db_session) | |||
|
46 | q = db_session.query(Application) | |||
|
47 | q = q.filter(Application.api_key == api_key) | |||
|
48 | q = q.options(sa.orm.eagerload(Application.owner)) | |||
|
49 | return q.first() | |||
|
50 | ||||
|
51 | @classmethod | |||
|
52 | def by_api_key_cached(cls, db_session=None): | |||
|
53 | db_session = get_db_session(db_session) | |||
|
54 | cache_region = get_region('redis_min_1') | |||
|
55 | ||||
|
56 | @cache_region.cache_on_arguments('ApplicationService.by_api_key') | |||
|
57 | def cached(*args, **kwargs): | |||
|
58 | app = cls.by_api_key(*args, db_session=db_session, **kwargs) | |||
|
59 | if app: | |||
|
60 | db_session.expunge(app) | |||
|
61 | return app | |||
|
62 | ||||
|
63 | return cached | |||
|
64 | ||||
|
65 | @classmethod | |||
|
66 | def by_public_api_key(cls, api_key, db_session=None, from_cache=False, | |||
|
67 | request=None): | |||
|
68 | db_session = get_db_session(db_session) | |||
|
69 | cache_region = get_region('redis_min_1') | |||
|
70 | ||||
|
71 | def uncached(api_key): | |||
|
72 | q = db_session.query(Application) | |||
|
73 | q = q.filter(Application.public_key == api_key) | |||
|
74 | q = q.options(sa.orm.eagerload(Application.owner)) | |||
|
75 | return q.first() | |||
|
76 | ||||
|
77 | if from_cache: | |||
|
78 | @cache_region.cache_on_arguments( | |||
|
79 | 'ApplicationService.by_public_api_key') | |||
|
80 | def cached(api_key): | |||
|
81 | app = uncached(api_key) | |||
|
82 | if app: | |||
|
83 | db_session.expunge(app) | |||
|
84 | return app | |||
|
85 | ||||
|
86 | app = cached(api_key) | |||
|
87 | else: | |||
|
88 | app = uncached(api_key) | |||
|
89 | return app | |||
|
90 | ||||
|
91 | @classmethod | |||
|
92 | def by_id(cls, db_id, db_session=None): | |||
|
93 | db_session = get_db_session(db_session) | |||
|
94 | q = db_session.query(Application) | |||
|
95 | q = q.filter(Application.resource_id == db_id) | |||
|
96 | return q.first() | |||
|
97 | ||||
|
98 | @classmethod | |||
|
99 | def by_id_cached(cls, db_session=None): | |||
|
100 | db_session = get_db_session(db_session) | |||
|
101 | cache_region = get_region('redis_min_1') | |||
|
102 | ||||
|
103 | @cache_region.cache_on_arguments('ApplicationService.by_id') | |||
|
104 | def cached(*args, **kwargs): | |||
|
105 | app = cls.by_id(*args, db_session=db_session, **kwargs) | |||
|
106 | if app: | |||
|
107 | db_session.expunge(app) | |||
|
108 | return app | |||
|
109 | ||||
|
110 | return cached | |||
|
111 | ||||
|
112 | @classmethod | |||
|
113 | def by_ids(cls, db_ids, db_session=None): | |||
|
114 | db_session = get_db_session(db_session) | |||
|
115 | query = db_session.query(Application) | |||
|
116 | query = query.filter(Application.resource_id.in_(db_ids)) | |||
|
117 | return query | |||
|
118 | ||||
|
119 | @classmethod | |||
|
120 | def by_http_referer(cls, referer_string, db_session=None): | |||
|
121 | db_session = get_db_session(db_session) | |||
|
122 | domain = urllib.parse.urlsplit( | |||
|
123 | referer_string, allow_fragments=False).netloc | |||
|
124 | if domain: | |||
|
125 | if domain.startswith('www.'): | |||
|
126 | domain = domain[4:] | |||
|
127 | q = db_session.query(Application).filter(Application.domain == domain) | |||
|
128 | return q.first() | |||
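The domain normalization in `by_http_referer` can be exercised on its own. This standalone sketch (the function name is illustrative) duplicates the netloc extraction and the `www.` strip:

```python
from urllib.parse import urlsplit

def referer_domain(referer_string):
    # Same normalization as by_http_referer: take the network location
    # of the referer URL and drop a leading "www." so that
    # "https://www.example.com/x" matches an Application whose domain
    # is stored as "example.com". Non-URL input yields an empty netloc.
    domain = urlsplit(referer_string, allow_fragments=False).netloc
    if domain.startswith('www.'):
        domain = domain[4:]
    return domain
```

Note that only an exact leading `www.` is stripped; other subdomains (`api.example.com`) are matched as-is against `Application.domain`.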
|
129 | ||||
|
130 | @classmethod | |||
|
131 | def last_updated(cls, since_when, exclude_status=None, db_session=None): | |||
|
132 | db_session = get_db_session(db_session) | |||
|
133 | q = db_session.query(Application) | |||
|
134 | q2 = ReportGroup.last_updated( | |||
|
135 | since_when, exclude_status=exclude_status, db_session=db_session) | |||
|
136 | q2 = q2.from_self(ReportGroup.resource_id) | |||
|
137 | q2 = q2.group_by(ReportGroup.resource_id) | |||
|
138 | q = q.filter(Application.resource_id.in_(q2)) | |||
|
139 | return q | |||
|
140 | ||||
|
141 | @classmethod | |||
|
142 | def check_for_groups_alert(cls, resource, event_type, *args, **kwargs): | |||
|
143 | """ Check for open alerts depending on group type. | |||
|
144 | Create new one if nothing is found and send alerts """ | |||
|
145 | db_session = get_db_session(kwargs.get('db_session')) | |||
|
146 | request = get_current_request() | |||
|
147 | report_groups = kwargs['report_groups'] | |||
|
148 | occurence_dict = kwargs['occurence_dict'] | |||
|
149 | ||||
|
150 | error_reports = 0 | |||
|
151 | slow_reports = 0 | |||
|
152 | for group in report_groups: | |||
|
153 | occurences = occurence_dict.get(group.id, 1) | |||
|
154 | if group.get_report().report_type == ReportType.error: | |||
|
155 | error_reports += occurences | |||
|
156 | elif group.get_report().report_type == ReportType.slow: | |||
|
157 | slow_reports += occurences | |||
|
158 | ||||
|
159 | log_msg = 'LIMIT INFO: %s : %s error reports. %s slow_reports' % ( | |||
|
160 | resource, | |||
|
161 | error_reports, | |||
|
162 | slow_reports) | |||
|
163 | log.warning(log_msg) | |||
|
164 | threshold = 10 | |||
|
165 | for event_type in ['error_report_alert', 'slow_report_alert']: | |||
|
166 | if (error_reports < resource.error_report_threshold and | |||
|
167 | event_type == 'error_report_alert'): | |||
|
168 | continue | |||
|
169 | elif (slow_reports <= resource.slow_report_threshold and | |||
|
170 | event_type == 'slow_report_alert'): | |||
|
171 | continue | |||
|
172 | if event_type == 'error_report_alert': | |||
|
173 | amount = error_reports | |||
|
174 | threshold = resource.error_report_threshold | |||
|
175 | elif event_type == 'slow_report_alert': | |||
|
176 | amount = slow_reports | |||
|
177 | threshold = resource.slow_report_threshold | |||
|
178 | ||||
|
179 | event = EventService.for_resource([resource.resource_id], | |||
|
180 | event_type=Event.types[ | |||
|
181 | event_type], | |||
|
182 | status=Event.statuses['active']) | |||
|
183 | if event.first(): | |||
|
184 | log.info('ALERT: PROGRESS: %s %s' % (event_type, resource)) | |||
|
185 | else: | |||
|
186 | log.warning('ALERT: OPEN: %s %s' % (event_type, resource)) | |||
|
187 | new_event = Event(resource_id=resource.resource_id, | |||
|
188 | event_type=Event.types[event_type], | |||
|
189 | status=Event.statuses['active'], | |||
|
190 | values={'reports': amount, | |||
|
191 | 'threshold': threshold}) | |||
|
192 | db_session.add(new_event) | |||
|
193 | new_event.send_alerts(request=request, resource=resource) |
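The guard clauses in `check_for_groups_alert` encode slightly asymmetric threshold semantics. A minimal sketch of just that decision (the function and parameter names are illustrative, not part of the codebase):

```python
def should_open_alert(event_type, error_reports, slow_reports,
                      error_threshold, slow_threshold):
    # Mirrors the continue-guards above: an error alert fires when the
    # count reaches the threshold (>=), while a slow alert fires only
    # when the count strictly exceeds it (>), because the original
    # compares with < for error reports but <= for slow reports.
    if event_type == 'error_report_alert':
        return error_reports >= error_threshold
    if event_type == 'slow_report_alert':
        return slow_reports > slow_threshold
    return False
```

Whether the `<` vs `<=` asymmetry is intentional is not stated in the source; the sketch simply preserves it.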
@@ -0,0 +1,41 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | ||||
|
23 | from appenlight.models import get_db_session | |||
|
24 | from appenlight.models.application_postprocess_conf import ApplicationPostprocessConf | |||
|
25 | from appenlight.models.services.base import BaseService | |||
|
26 | ||||
|
27 | ||||
|
28 | class ApplicationPostprocessConfService(BaseService): | |||
|
29 | ||||
|
30 | @classmethod | |||
|
31 | def by_pkey(cls, pkey, db_session=None): | |||
|
32 | db_session = get_db_session(db_session) | |||
|
33 | query = db_session.query(ApplicationPostprocessConf) | |||
|
34 | return query.filter(ApplicationPostprocessConf.pkey == pkey).first() | |||
|
35 | ||||
|
36 | @classmethod | |||
|
37 | def by_pkey_and_resource_id(cls, pkey, resource_id, db_session=None): | |||
|
38 | db_session = get_db_session(db_session) | |||
|
39 | query = db_session.query(ApplicationPostprocessConf) | |||
|
40 | query = query.filter(ApplicationPostprocessConf.resource_id == resource_id) | |||
|
41 | return query.filter(ApplicationPostprocessConf.pkey == pkey).first() |
@@ -0,0 +1,36 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | ||||
|
24 | from appenlight.models.auth_token import AuthToken | |||
|
25 | from appenlight.models.services.base import BaseService | |||
|
26 | from ziggurat_foundations.models.base import get_db_session | |||
|
27 | ||||
|
28 | log = logging.getLogger(__name__) | |||
|
29 | ||||
|
30 | ||||
|
31 | class AuthTokenService(BaseService): | |||
|
32 | @classmethod | |||
|
33 | def by_token(cls, token, db_session=None): | |||
|
34 | db_session = get_db_session(db_session) | |||
|
35 | query = db_session.query(AuthToken).filter(AuthToken.token == token) | |||
|
36 | return query.first() |
@@ -0,0 +1,24 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | ||||
|
23 | class BaseService(object): | |||
|
24 | pass |
@@ -0,0 +1,105 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | ||||
|
24 | import sqlalchemy as sa | |||
|
25 | from pyramid.threadlocal import get_current_registry | |||
|
26 | ||||
|
27 | from appenlight.models.config import Config | |||
|
28 | from appenlight.models.services.base import BaseService | |||
|
29 | from appenlight.models import get_db_session | |||
|
30 | ||||
|
31 | log = logging.getLogger(__name__) | |||
|
32 | ||||
|
33 | ||||
|
34 | class ConfigService(BaseService): | |||
|
35 | @classmethod | |||
|
36 | def all(cls, db_session=None): | |||
|
37 | db_session = get_db_session(db_session) | |||
|
38 | query = db_session.query(Config) | |||
|
39 | return query | |||
|
40 | ||||
|
41 | @classmethod | |||
|
42 | def filtered_key_and_section(cls, pairs=None, db_session=None): | |||
|
43 | db_session = get_db_session(db_session) | |||
|
44 | query = db_session.query(Config) | |||
|
45 | if pairs: | |||
|
46 | conditions = [] | |||
|
47 | for pair in pairs: | |||
|
48 | conditions.append(sa.and_( | |||
|
49 | Config.key == pair['key'], | |||
|
50 | Config.section == pair['section']) | |||
|
51 | ) | |||
|
52 | ||||
|
53 | query = query.filter(sa.or_(*conditions)) | |||
|
54 | return query | |||
|
55 | ||||
|
56 | @classmethod | |||
|
57 | def create_config(cls, key, section, value=None, db_session=None): | |||
|
58 | config = Config(key=key, section=section, value=value) | |||
|
59 | db_session = get_db_session(db_session) | |||
|
60 | db_session.add(config) | |||
|
61 | db_session.flush() | |||
|
62 | return config | |||
|
63 | ||||
|
64 | @classmethod | |||
|
65 | def by_key_and_section(cls, key, section, auto_create=False, | |||
|
66 | default_value=None, db_session=None): | |||
|
67 | db_session = get_db_session(db_session) | |||
|
68 | registry = get_current_registry() | |||
|
69 | ||||
|
70 | @registry.cache_regions.memory_min_1.cache_on_arguments( | |||
|
71 | namespace='ConfigService.by_key_and_section') | |||
|
72 | def cached(key, section): | |||
|
73 | query = db_session.query(Config).filter(Config.key == key) | |||
|
74 | query = query.filter(Config.section == section) | |||
|
75 | config = query.first() | |||
|
76 | if config: | |||
|
77 | db_session.expunge(config) | |||
|
78 | return config | |||
|
79 | ||||
|
80 | config = cached(key, section) | |||
|
81 | if config: | |||
|
82 | config = db_session.merge(config, load=False) | |||
|
83 | if config is None and auto_create: | |||
|
84 | config = ConfigService.create_config(key, section, | |||
|
85 | value=default_value) | |||
|
86 | cached.invalidate(key, section) | |||
|
87 | return config | |||
|
88 | ||||
|
89 | @classmethod | |||
|
90 | def setup_default_values(cls): | |||

91 | """ | |||

92 | Adds fresh default config values to the database if keys are not found | |||

93 | :return: | |||

94 | """ | |||

95 | log.info('Checking/setting default values') | |||

96 | cls.by_key_and_section('template_footer_html', 'global', | |||

97 | default_value='', auto_create=True) | |||

98 | cls.by_key_and_section('list_groups_to_non_admins', 'global', | |||

99 | default_value=True, auto_create=True) | |||

100 | cls.by_key_and_section('per_application_reports_rate_limit', 'global', | |||

101 | default_value=2000, auto_create=True) | |||

102 | cls.by_key_and_section('per_application_logs_rate_limit', 'global', | |||

103 | default_value=100000, auto_create=True) | |||

104 | cls.by_key_and_section('per_application_metrics_rate_limit', 'global', | |||
|
105 | default_value=100000, auto_create=True) |
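`by_key_and_section` relies on a cache decorator that supports per-argument invalidation (a dogpile-style `cache_on_arguments` / `invalidate` contract). A minimal in-memory stand-in for that contract, enough to show the lookup / auto-create / invalidate flow; it is a sketch, not the real region implementation:

```python
def cache_on_arguments(fn):
    # Simplified stand-in for the region decorator used above: results
    # are memoized per positional-argument tuple, and invalidate(*args)
    # evicts a single entry so the next call recomputes -- which is
    # what by_key_and_section does after auto-creating a missing row.
    store = {}

    def wrapper(*args):
        if args not in store:
            store[args] = fn(*args)
        return store[args]

    wrapper.invalidate = lambda *args: store.pop(args, None)
    return wrapper

calls = []

@cache_on_arguments
def lookup(key, section):
    calls.append((key, section))
    return None  # simulate "config row not found"

lookup('k', 's')
lookup('k', 's')             # cache hit: the body is not re-run
lookup.invalidate('k', 's')  # evict, as done after auto-create
lookup('k', 's')             # recomputed after invalidation
```

A cached `None` still counts as a hit here (the key exists in the store), which is why the real method must explicitly invalidate after creating the row rather than rely on `None` falling through the cache.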
@@ -0,0 +1,114 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | from pyramid.threadlocal import get_current_registry | |||
|
24 | from paginate_sqlalchemy import SqlalchemyOrmPage | |||
|
25 | from appenlight.models import get_db_session | |||
|
26 | from appenlight.models.event import Event | |||
|
27 | from appenlight.models.services.base import BaseService | |||
|
28 | ||||
|
29 | ||||
|
30 | class EventService(BaseService): | |||
|
31 | @classmethod | |||
|
32 | def for_resource(cls, resource_ids, event_type=None, status=None, | |||
|
33 | since_when=None, limit=20, event_id=None, | |||
|
34 | target_uuid=None, order_by=None, or_target_user_id=None, | |||
|
35 | db_session=None): | |||
|
36 | """ | |||
|
37 | Fetches events based on the passed params, OR, if target_user_id | |||

38 | is present, also includes events that target just this user | |||
|
39 | """ | |||
|
40 | db_session = get_db_session(db_session) | |||
|
41 | query = db_session.query(Event) | |||
|
42 | query = query.options(sa.orm.joinedload(Event.resource)) | |||
|
43 | and_cond = [Event.resource_id.in_(resource_ids)] | |||
|
44 | if not resource_ids: | |||
|
45 | and_cond = [Event.resource_id == -999] | |||
|
46 | ||||
|
47 | if event_type: | |||
|
48 | and_cond.append(Event.event_type == event_type) | |||
|
49 | if status: | |||
|
50 | and_cond.append(Event.status == status) | |||
|
51 | if since_when: | |||
|
52 | and_cond.append(Event.start_date >= since_when) | |||
|
53 | if event_id: | |||
|
54 | and_cond.append(Event.id == event_id) | |||
|
55 | if target_uuid: | |||
|
56 | and_cond.append(Event.target_uuid == target_uuid) | |||
|
57 | ||||
|
58 | or_cond = [] | |||
|
59 | ||||
|
60 | if or_target_user_id: | |||
|
61 | or_cond.append(sa.or_(Event.target_user_id == or_target_user_id)) | |||
|
62 | ||||
|
63 | query = query.filter(sa.or_(sa.and_(*and_cond), | |||
|
64 | *or_cond)) | |||
|
65 | if not order_by: | |||
|
66 | query = query.order_by(sa.desc(Event.start_date)) | |||
|
67 | if limit: | |||
|
68 | query = query.limit(limit) | |||
|
69 | ||||
|
70 | return query | |||
|
71 | ||||
|
72 | @classmethod | |||
|
73 | def by_type_and_status(cls, event_types, status_types, since_when=None, | |||
|
74 | older_than=None, db_session=None, app_ids=None): | |||
|
75 | db_session = get_db_session(db_session) | |||
|
76 | query = db_session.query(Event) | |||
|
77 | query = query.filter(Event.event_type.in_(event_types)) | |||
|
78 | query = query.filter(Event.status.in_(status_types)) | |||
|
79 | if since_when: | |||
|
80 | query = query.filter(Event.start_date >= since_when) | |||
|
81 | if older_than: | |||
|
82 | query = query.filter(Event.start_date <= older_than) | |||
|
83 | if app_ids: | |||
|
84 | query = query.filter(Event.resource_id.in_(app_ids)) | |||
|
85 | return query | |||
|
86 | ||||
|
87 | @classmethod | |||
|
88 | def latest_for_user(cls, user, db_session=None): | |||
|
89 | registry = get_current_registry() | |||
|
90 | resources = user.resources_with_perms( | |||
|
91 | ['view'], resource_types=registry.resource_types) | |||
|
92 | resource_ids = [r.resource_id for r in resources] | |||
|
93 | db_session = get_db_session(db_session) | |||
|
94 | return EventService.for_resource( | |||
|
95 | resource_ids, or_target_user_id=user.id, limit=10, | |||
|
96 | db_session=db_session) | |||
|
97 | ||||
|
98 | @classmethod | |||
|
99 | def get_paginator(cls, user, page=1, item_count=None, items_per_page=50, | |||
|
100 | order_by=None, filter_settings=None, db_session=None): | |||
|
101 | if not filter_settings: | |||
|
102 | filter_settings = {} | |||
|
103 | registry = get_current_registry() | |||
|
104 | resources = user.resources_with_perms( | |||
|
105 | ['view'], resource_types=registry.resource_types) | |||
|
106 | resource_ids = [r.resource_id for r in resources] | |||
|
107 | query = EventService.for_resource( | |||
|
108 | resource_ids, or_target_user_id=user.id, limit=100, | |||
|
109 | db_session=db_session) | |||
|
110 | ||||
|
111 | paginator = SqlalchemyOrmPage(query, page=page, | |||
|
112 | items_per_page=items_per_page, | |||
|
113 | **filter_settings) | |||
|
114 | return paginator |
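`for_resource` combines its filters as `sa.or_(sa.and_(*and_cond), *or_cond)`: an event matches either the full AND chain or any standalone OR term. A minimal plain-Python sketch of that predicate logic (dicts stand in for `Event` rows; this is an illustration, not the ORM query):

```python
def event_matches(event, resource_ids, event_type=None,
                  or_target_user_id=None):
    """Sketch of sa.or_(sa.and_(*and_cond), *or_cond) over a plain dict."""
    # AND chain: mandatory resource filter plus any optional filters
    and_ok = event["resource_id"] in resource_ids
    if event_type is not None:
        and_ok = and_ok and event["event_type"] == event_type
    # OR part: events that directly target the given user
    or_ok = (or_target_user_id is not None
             and event["target_user_id"] == or_target_user_id)
    return and_ok or or_ok


event = {"resource_id": 7, "event_type": "alert", "target_user_id": 42}
# matches via the OR branch even though resource 7 is not in [1, 2]
print(event_matches(event, [1, 2], or_target_user_id=42))  # True
```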
@@ -0,0 +1,32 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

from appenlight.models import get_db_session
from appenlight.models.group import Group
from appenlight.models.services.base import BaseService


class GroupService(BaseService):
    @classmethod
    def by_id(cls, group_id, db_session=None):
        db_session = get_db_session(db_session)
        query = db_session.query(Group).filter(Group.id == group_id)
        return query.first()
@@ -0,0 +1,38 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

from appenlight.models.group_resource_permission import GroupResourcePermission
from appenlight.models import get_db_session
from appenlight.models.services.base import BaseService


class GroupResourcePermissionService(BaseService):
    @classmethod
    def by_resource_group_and_perm(cls, group_id, perm_name, resource_id,
                                   db_session=None):
        """ return the first permission matching group id, perm name and
        resource id """
        db_session = get_db_session(db_session)
        query = db_session.query(GroupResourcePermission)
        query = query.filter(GroupResourcePermission.group_id == group_id)
        query = query.filter(
            GroupResourcePermission.resource_id == resource_id)
        query = query.filter(GroupResourcePermission.perm_name == perm_name)
        return query.first()
@@ -0,0 +1,207 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import paginate
import logging
import sqlalchemy as sa

from appenlight.models.log import Log
from appenlight.models import get_db_session, Datastores
from appenlight.models.services.base import BaseService
from appenlight.lib.utils import es_index_name_limiter

log = logging.getLogger(__name__)


class LogService(BaseService):
    @classmethod
    def get_logs(cls, resource_ids=None, filter_settings=None,
                 db_session=None):
        # ensure we always have ids passed
        if not resource_ids:
            # raise Exception('No App ID passed')
            return []
        filter_settings = filter_settings or {}
        db_session = get_db_session(db_session)
        q = db_session.query(Log)
        q = q.filter(Log.resource_id.in_(resource_ids))
        if filter_settings.get('start_date'):
            q = q.filter(Log.timestamp >= filter_settings.get('start_date'))
        if filter_settings.get('end_date'):
            q = q.filter(Log.timestamp <= filter_settings.get('end_date'))
        if filter_settings.get('log_level'):
            q = q.filter(
                Log.log_level == filter_settings.get('log_level').upper())
        if filter_settings.get('request_id'):
            request_id = filter_settings.get('request_id', '')
            q = q.filter(Log.request_id == request_id.replace('-', ''))
        if filter_settings.get('namespace'):
            q = q.filter(Log.namespace == filter_settings.get('namespace'))
        q = q.order_by(sa.desc(Log.timestamp))
        return q

    @classmethod
    def es_query_builder(cls, app_ids, filter_settings):
        if not filter_settings:
            filter_settings = {}

        query = {
            "query": {
                "filtered": {
                    "filter": {
                        "and": [{"terms": {"resource_id": list(app_ids)}}]
                    }
                }
            }
        }

        start_date = filter_settings.get('start_date')
        end_date = filter_settings.get('end_date')
        filter_part = query['query']['filtered']['filter']['and']

        for tag in filter_settings.get('tags', []):
            tag_values = [v.lower() for v in tag['value']]
            key = "tags.%s.values" % tag['name'].replace('.', '_')
            filter_part.append({"terms": {key: tag_values}})

        date_range = {"range": {"timestamp": {}}}
        if start_date:
            date_range["range"]["timestamp"]["gte"] = start_date
        if end_date:
            date_range["range"]["timestamp"]["lte"] = end_date
        if start_date or end_date:
            filter_part.append(date_range)

        levels = filter_settings.get('level')
        if levels:
            filter_part.append({"terms": {'log_level': levels}})
        namespaces = filter_settings.get('namespace')
        if namespaces:
            filter_part.append({"terms": {'namespace': namespaces}})

        request_ids = filter_settings.get('request_id')
        if request_ids:
            filter_part.append({"terms": {'request_id': request_ids}})

        messages = filter_settings.get('message')
        if messages:
            query['query']['filtered']['query'] = {
                'match': {"message": ' '.join(messages)}}
        return query

    @classmethod
    def get_time_series_aggregate(cls, app_ids=None, filter_settings=None):
        if not app_ids:
            return {}
        filter_settings = filter_settings or {}
        es_query = cls.es_query_builder(app_ids, filter_settings)
        es_query["aggs"] = {
            "events_over_time": {
                "date_histogram": {
                    "field": "timestamp",
                    "interval": "1h",
                    "min_doc_count": 0
                }
            }
        }
        log.debug(es_query)
        index_names = es_index_name_limiter(filter_settings.get('start_date'),
                                            filter_settings.get('end_date'),
                                            ixtypes=['logs'])
        if index_names:
            results = Datastores.es.search(
                es_query, index=index_names, doc_type='log', size=0)
        else:
            results = []
        return results

    @classmethod
    def get_search_iterator(cls, app_ids=None, page=1, items_per_page=50,
                            order_by=None, filter_settings=None, limit=None):
        if not app_ids:
            return {}, 0
        filter_settings = filter_settings or {}

        es_query = cls.es_query_builder(app_ids, filter_settings)
        sort_query = {
            "sort": [
                {"timestamp": {"order": "desc"}}
            ]
        }
        es_query.update(sort_query)
        log.debug(es_query)
        es_from = (page - 1) * items_per_page
        index_names = es_index_name_limiter(filter_settings.get('start_date'),
                                            filter_settings.get('end_date'),
                                            ixtypes=['logs'])
        if not index_names:
            return {}, 0

        results = Datastores.es.search(es_query, index=index_names,
                                       doc_type='log', size=items_per_page,
                                       es_from=es_from)
        if results['hits']['total'] > 5000:
            count = 5000
        else:
            count = results['hits']['total']
        return results['hits'], count

    @classmethod
    def get_paginator_by_app_ids(cls, app_ids=None, page=1, item_count=None,
                                 items_per_page=50, order_by=None,
                                 filter_settings=None,
                                 exclude_columns=None, db_session=None):
        if not filter_settings:
            filter_settings = {}
        results, item_count = cls.get_search_iterator(app_ids, page,
                                                      items_per_page, order_by,
                                                      filter_settings)
        paginator = paginate.Page([],
                                  item_count=item_count,
                                  items_per_page=items_per_page,
                                  **filter_settings)
        ordered_ids = tuple(item['_source']['pg_id']
                            for item in results.get('hits', []))

        sorted_instance_list = []
        if ordered_ids:
            db_session = get_db_session(db_session)
            query = db_session.query(Log)
            query = query.filter(Log.log_id.in_(ordered_ids))
            query = query.order_by(sa.desc('timestamp'))
            sa_items = query.all()
            # re-sort DB rows to match the order returned by Elasticsearch
            for i_id in ordered_ids:
                for item in sa_items:
                    if str(item.log_id) == str(i_id):
                        sorted_instance_list.append(item)
        paginator.sa_items = sorted_instance_list
        return paginator

    @classmethod
    def query_by_primary_key_and_namespace(cls, list_of_pairs,
                                           db_session=None):
        db_session = get_db_session(db_session)
        list_of_conditions = []
        query = db_session.query(Log)
        for pair in list_of_pairs:
            list_of_conditions.append(sa.and_(
                Log.primary_key == pair['pk'], Log.namespace == pair['ns']))
        query = query.filter(sa.or_(*list_of_conditions))
        query = query.order_by(sa.asc(Log.timestamp), sa.asc(Log.log_id))
        return query
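`es_query_builder` assembles an Elasticsearch 1.x `filtered` query: an `and` filter list seeded with the mandatory `resource_id` terms clause, with one extra clause appended per optional setting. A standalone sketch of that dict assembly (no ES client involved; only a subset of the real settings is shown):

```python
def build_log_query(app_ids, start_date=None, end_date=None, level=None):
    # Seed the filter list with the mandatory resource_id terms filter,
    # then append one clause per optional setting -- the same shape the
    # service produces for the ES 1.x "filtered" query DSL.
    and_part = [{"terms": {"resource_id": list(app_ids)}}]
    date_range = {"range": {"timestamp": {}}}
    if start_date:
        date_range["range"]["timestamp"]["gte"] = start_date
    if end_date:
        date_range["range"]["timestamp"]["lte"] = end_date
    if start_date or end_date:
        and_part.append(date_range)
    if level:
        and_part.append({"terms": {"log_level": level}})
    return {"query": {"filtered": {"filter": {"and": and_part}}}}


q = build_log_query([1], start_date="2016-01-01", level=["ERROR"])
```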
@@ -0,0 +1,57 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import logging

from appenlight.models.plugin_config import PluginConfig
from appenlight.models.services.base import BaseService
from appenlight.models import get_db_session

log = logging.getLogger(__name__)


class PluginConfigService(BaseService):
    @classmethod
    def all(cls, db_session=None):
        db_session = get_db_session(db_session)
        query = db_session.query(PluginConfig)
        return query

    @classmethod
    def by_id(cls, plugin_id, db_session=None):
        db_session = get_db_session(db_session)
        query = db_session.query(PluginConfig)
        query = query.filter(PluginConfig.id == plugin_id)
        return query.first()

    @classmethod
    def by_query(cls, resource_id=None, plugin_name=None,
                 section=None, db_session=None):
        db_session = get_db_session(db_session)

        query = db_session.query(PluginConfig)
        if resource_id:
            query = query.filter(PluginConfig.resource_id == resource_id)
        if plugin_name:
            query = query.filter(PluginConfig.plugin_name == plugin_name)
        if section:
            query = query.filter(PluginConfig.section == section)
        return query
@@ -0,0 +1,62 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import logging
import sqlalchemy as sa

from appenlight.models import get_db_session
from appenlight.models.report import Report
from appenlight.models.report_stat import ReportStat
from appenlight.models.services.base import BaseService

log = logging.getLogger(__name__)


class ReportService(BaseService):
    @classmethod
    def by_app_ids(cls, app_ids=None, order_by=True, db_session=None):
        db_session = get_db_session(db_session)
        q = db_session.query(Report)
        if app_ids:
            q = q.filter(Report.resource_id.in_(app_ids))
        if order_by:
            q = q.order_by(sa.desc(Report.id))
        return q

    @classmethod
    def generate_stat_rows(cls, report, resource, report_group, occurences=1,
                           db_session=None):
        """
        Generates timeseries for this report's group
        """
        db_session = get_db_session(db_session)
        stats = ReportStat(resource_id=report.resource_id,
                           group_id=report_group.id,
                           start_interval=report.start_time,
                           owner_user_id=resource.owner_user_id,
                           server_name=report.tags.get('server_name'),
                           view_name=report.tags.get('view_name'),
                           type=report.report_type,
                           occurences=occurences,
                           duration=report.duration)
        db_session.add(stats)
        db_session.flush()
        return stats
@@ -0,0 +1,33 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

from appenlight.models import get_db_session
# import path assumed from the sibling model modules; querying the
# unmapped service class itself would fail at runtime
from appenlight.models.report_assignment import ReportAssignment
from appenlight.models.services.base import BaseService


class ReportAssignmentService(BaseService):
    @classmethod
    def by_group_id_and_user(cls, group_id, user_name, db_session=None):
        db_session = get_db_session(db_session)

        query = db_session.query(ReportAssignment).filter(
            ReportAssignment.group_id == group_id)
        query = query.filter(ReportAssignment.user_name == user_name)
        return query.first()
@@ -0,0 +1,454 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# App Enlight Enterprise Edition, including its added features, Support
# services, and proprietary license terms, please see
# https://rhodecode.com/licenses/

import logging
import paginate
import sqlalchemy as sa
import appenlight.lib.helpers as h

from datetime import datetime

from appenlight.models import get_db_session, Datastores
from appenlight.models.report import Report
from appenlight.models.report_group import ReportGroup
from appenlight.models.report_comment import ReportComment
from appenlight.models.user import User
from appenlight.models.services.base import BaseService
from appenlight.lib.enums import ReportType
from appenlight.lib.utils import es_index_name_limiter

log = logging.getLogger(__name__)


class ReportGroupService(BaseService):
    @classmethod
    def get_trending(cls, request, filter_settings, limit=15,
                     db_session=None):
        """
        Returns report groups trending for a specific time interval
        """
        db_session = get_db_session(db_session)

        tags = []
        if filter_settings.get('tags'):
            for tag in filter_settings['tags']:
                tags.append(
                    {'terms': {
                        'tags.{}.values'.format(tag['name']): tag['value']}})

        index_names = es_index_name_limiter(
            start_date=filter_settings['start_date'],
            end_date=filter_settings['end_date'],
            ixtypes=['reports'])

        if not index_names or not filter_settings['resource']:
            return []

        es_query = {
            'aggs': {'parent_agg': {'aggs': {'groups': {'aggs': {
                'sub_agg': {
                    'value_count': {'field': 'tags.group_id.values'}}},
                'filter': {'exists': {'field': 'tags.group_id.values'}}}},
                'terms': {'field': 'tags.group_id.values', 'size': limit}}},
            'query': {'filtered': {
                'filter': {'and': [
                    {'terms': {
                        'resource_id': [filter_settings['resource'][0]]}
                    },
                    {'range': {'timestamp': {
                        'gte': filter_settings['start_date'],
                        'lte': filter_settings['end_date']}}}]
                }
            }}
        }
        if tags:
            es_query['query']['filtered']['filter']['and'].extend(tags)

        result = Datastores.es.search(
            es_query, index=index_names, doc_type='log', size=0)
        series = []
        for bucket in result['aggregations']['parent_agg']['buckets']:
            series.append({
                'key': bucket['key'],
                'groups': bucket['groups']['sub_agg']['value']
            })

        report_groups_d = {}
        for g in series:
            report_groups_d[int(g['key'])] = g['groups'] or 0

        query = db_session.query(ReportGroup)
        query = query.filter(ReportGroup.id.in_(list(report_groups_d.keys())))
        query = query.options(
            sa.orm.joinedload(ReportGroup.last_report_ref))
        results = [(report_groups_d[group.id], group,) for group in query]
        return sorted(results, reverse=True, key=lambda x: x[0])
|
104 | ||||
|
105 | @classmethod | |||
|
106 | def get_search_iterator(cls, app_ids=None, page=1, items_per_page=50, | |||
|
107 | order_by=None, filter_settings=None, limit=None): | |||
|
108 | if not app_ids: | |||
|
109 | return {} | |||
|
110 | if not filter_settings: | |||
|
111 | filter_settings = {} | |||
|
112 | ||||
|
113 | query = { | |||
|
114 | "size": 0, | |||
|
115 | "query": { | |||
|
116 | "filtered": { | |||
|
117 | "filter": { | |||
|
118 | "and": [{"terms": {"resource_id": list(app_ids)}}] | |||
|
119 | } | |||
|
120 | } | |||
|
121 | }, | |||
|
122 | ||||
|
123 | "aggs": { | |||
|
124 | "top_groups": { | |||
|
125 | "terms": { | |||
|
126 | "size": 5000, | |||
|
127 | "field": "_parent", | |||
|
128 | "order": { | |||
|
129 | "newest": "desc" | |||
|
130 | } | |||
|
131 | }, | |||
|
132 | "aggs": { | |||
|
133 | "top_reports_hits": { | |||
|
134 | "top_hits": {"size": 1, | |||
|
135 | "sort": {"start_time": "desc"} | |||
|
136 | } | |||
|
137 | }, | |||
|
138 | "newest": { | |||
|
139 | "max": {"field": "start_time"} | |||
|
140 | } | |||
|
141 | } | |||
|
142 | } | |||
|
143 | } | |||
|
144 | } | |||
|
145 | ||||
|
146 | start_date = filter_settings.get('start_date') | |||
|
147 | end_date = filter_settings.get('end_date') | |||
|
148 | filter_part = query['query']['filtered']['filter']['and'] | |||
|
149 | date_range = {"range": {"start_time": {}}} | |||
|
150 | if start_date: | |||
|
151 | date_range["range"]["start_time"]["gte"] = start_date | |||
|
152 | if end_date: | |||
|
153 | date_range["range"]["start_time"]["lte"] = end_date | |||
|
154 | if start_date or end_date: | |||
|
155 | filter_part.append(date_range) | |||
|
156 | ||||
|
157 | priorities = filter_settings.get('priority') | |||
|
158 | ||||
|
159 | for tag in filter_settings.get('tags', []): | |||
|
160 | tag_values = [v.lower() for v in tag['value']] | |||
|
161 | key = "tags.%s.values" % tag['name'].replace('.', '_') | |||
|
162 | filter_part.append({"terms": {key: tag_values}}) | |||
|
163 | ||||
|
164 | if priorities: | |||
|
165 | filter_part.append({"has_parent": { | |||
|
166 | "parent_type": "report_group", | |||
|
167 | "query": { | |||
|
168 | "terms": {'priority': priorities} | |||
|
169 | }}}) | |||
|
170 | ||||
|
171 | min_occurences = filter_settings.get('min_occurences') | |||
|
172 | if min_occurences: | |||
|
173 | filter_part.append({"has_parent": { | |||
|
174 | "parent_type": "report_group", | |||
|
175 | "query": { | |||
|
176 | "range": {'occurences': {"gte": min_occurences[0]}} | |||
|
177 | }}}) | |||
|
178 | ||||
|
179 | min_duration = filter_settings.get('min_duration') | |||
|
180 | max_duration = filter_settings.get('max_duration') | |||
|
181 | ||||
|
182 | request_ids = filter_settings.get('request_id') | |||
|
183 | if request_ids: | |||
|
184 | filter_part.append({"terms": {'request_id': request_ids}}) | |||
|
185 | ||||
|
186 | duration_range = {"range": {"average_duration": {}}} | |||
|
187 | if min_duration: | |||
|
188 | duration_range["range"]["average_duration"]["gte"] = \ | |||
|
189 | min_duration[0] | |||
|
190 | if max_duration: | |||
|
191 | duration_range["range"]["average_duration"]["lte"] = \ | |||
|
192 | max_duration[0] | |||
|
193 | if min_duration or max_duration: | |||
|
194 | filter_part.append({"has_parent": { | |||
|
195 | "parent_type": "report_group", | |||
|
196 | "query": duration_range}}) | |||
|
197 | ||||
|
198 | http_status = filter_settings.get('http_status') | |||
|
199 | report_type = filter_settings.get('report_type', [ReportType.error]) | |||
|
200 | # set error report type if http status is not found | |||
|
201 | # and we are dealing with slow reports | |||
|
202 | if not http_status or ReportType.slow in report_type: | |||
|
203 | filter_part.append({"terms": {'report_type': report_type}}) | |||
|
204 | if http_status: | |||
|
205 | filter_part.append({"terms": {'http_status': http_status}}) | |||
|
206 | ||||
|
207 | messages = filter_settings.get('message') | |||
|
208 | if messages: | |||
|
209 | condition = {'match': {"message": ' '.join(messages)}} | |||
|
210 | query['query']['filtered']['query'] = condition | |||
|
211 | errors = filter_settings.get('error') | |||
|
212 | if errors: | |||
|
213 | condition = {'match': {"error": ' '.join(errors)}} | |||
|
214 | query['query']['filtered']['query'] = condition | |||
|
215 | url_domains = filter_settings.get('url_domain') | |||
|
216 | if url_domains: | |||
|
217 | condition = {'terms': {"url_domain": url_domains}} | |||
|
218 | query['query']['filtered']['query'] = condition | |||
|
219 | url_paths = filter_settings.get('url_path') | |||
|
220 | if url_paths: | |||
|
221 | condition = {'terms': {"url_path": url_paths}} | |||
|
222 | query['query']['filtered']['query'] = condition | |||
|
223 | ||||
|
224 | if filter_settings.get('report_status'): | |||
|
225 | for status in filter_settings.get('report_status'): | |||
|
226 | if status == 'never_reviewed': | |||
|
227 | filter_part.append({"has_parent": { | |||
|
228 | "parent_type": "report_group", | |||
|
229 | "query": { | |||
|
230 | "term": {"read": False} | |||
|
231 | }}}) | |||
|
232 | elif status == 'reviewed': | |||
|
233 | filter_part.append({"has_parent": { | |||
|
234 | "parent_type": "report_group", | |||
|
235 | "query": { | |||
|
236 | "term": {"read": True} | |||
|
237 | }}}) | |||
|
238 | elif status == 'public': | |||
|
239 | filter_part.append({"has_parent": { | |||
|
240 | "parent_type": "report_group", | |||
|
241 | "query": { | |||
|
242 | "term": {"public": True} | |||
|
243 | }}}) | |||
|
244 | elif status == 'fixed': | |||
|
245 | filter_part.append({"has_parent": { | |||
|
246 | "parent_type": "report_group", | |||
|
247 | "query": { | |||
|
248 | "term": {"fixed": True} | |||
|
249 | }}}) | |||
|
250 | ||||
|
251 | # logging.getLogger('pyelasticsearch').setLevel(logging.DEBUG) | |||
|
252 | index_names = es_index_name_limiter(filter_settings.get('start_date'), | |||
|
253 | filter_settings.get('end_date'), | |||
|
254 | ixtypes=['reports']) | |||
|
255 | if index_names: | |||
|
256 | results = Datastores.es.search( | |||
|
257 | query, index=index_names, doc_type=["report", "report_group"], | |||
|
258 | size=0) | |||
|
259 | else: | |||
|
260 | return [] | |||
|
261 | return results['aggregations'] | |||
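The duration filter built above adds `gte`/`lte` bounds to the range only when the corresponding setting is present. That pattern can be sketched as a standalone helper (the function name is illustrative, not part of this codebase):

```python
def build_range_filter(field, gte=None, lte=None):
    """Build an ES 1.x-style range filter containing only the supplied bounds."""
    bounds = {}
    if gte is not None:
        bounds['gte'] = gte
    if lte is not None:
        bounds['lte'] = lte
    # return None when no bounds were given so callers can skip appending
    return {'range': {field: bounds}} if bounds else None
```

A caller would append the result to `filter_part` only when it is not `None`, which is exactly what the `if min_duration or max_duration:` guard above achieves.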
|
262 | ||||
|
263 | @classmethod | |||
|
264 | def get_paginator_by_app_ids(cls, app_ids=None, page=1, item_count=None, | |||
|
265 | items_per_page=50, order_by=None, | |||
|
266 | filter_settings=None, | |||
|
267 | exclude_columns=None, db_session=None): | |||
|
268 | if not filter_settings: | |||
|
269 | filter_settings = {} | |||
|
270 | results = cls.get_search_iterator(app_ids, page, items_per_page, | |||
|
271 | order_by, filter_settings) | |||
|
272 | ||||
|
273 | ordered_ids = [] | |||
|
274 | if results: | |||
|
275 | for item in results['top_groups']['buckets']: | |||
|
276 | pg_id = item['top_reports_hits']['hits']['hits'][0]['_source'][ | |||
|
277 | 'pg_id'] | |||
|
278 | ordered_ids.append(pg_id) | |||
|
279 | log.info(filter_settings) | |||
|
280 | paginator = paginate.Page(ordered_ids, items_per_page=items_per_page, | |||
|
281 | **filter_settings) | |||
|
282 | sa_items = () | |||
|
283 | if paginator.items: | |||
|
284 | db_session = get_db_session(db_session) | |||
|
285 | # latest report detail | |||
|
286 | query = db_session.query(Report) | |||
|
287 | query = query.options(sa.orm.joinedload(Report.report_group)) | |||
|
288 | query = query.filter(Report.id.in_(paginator.items)) | |||
|
289 | if filter_settings.get('order_col'): | |||
|
290 | order_col = filter_settings.get('order_col') | |||
|
291 | if filter_settings.get('order_dir') == 'dsc': | |||
|
292 | sort_on = 'desc' | |||
|
293 | else: | |||
|
294 | sort_on = 'asc' | |||
|
295 | if order_col == 'when': | |||
|
296 | order_col = 'last_timestamp' | |||
|
297 | query = query.order_by(getattr(sa, sort_on)( | |||
|
298 | getattr(ReportGroup, order_col))) | |||
|
299 | sa_items = query.all() | |||
|
300 | sorted_instance_list = [] | |||
|
301 | for i_id in ordered_ids: | |||
|
302 | for report in sa_items: | |||
|
303 | if (str(report.id) == i_id and | |||
|
304 | report not in sorted_instance_list): | |||
|
305 | sorted_instance_list.append(report) | |||
|
306 | paginator.sa_items = sorted_instance_list | |||
|
307 | return paginator | |||
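`get_paginator_by_app_ids` re-sorts the SQLAlchemy rows to match the ordering that Elasticsearch returned, using a nested loop over `ordered_ids` and `sa_items`. The same idea can be sketched with a dictionary lookup, which is O(n) instead of O(n²) (names are illustrative):

```python
def sort_by_external_order(items, ordered_ids, key):
    """Arrange items to follow ordered_ids; ids with no matching item are skipped."""
    by_key = {}
    for item in items:
        # keep the first occurrence so duplicates do not repeat in the output
        by_key.setdefault(key(item), item)
    return [by_key[i] for i in ordered_ids if i in by_key]
```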
|
308 | ||||
|
309 | @classmethod | |||
|
310 | def by_app_ids(cls, app_ids=None, order_by=True, db_session=None): | |||
|
311 | db_session = get_db_session(db_session) | |||
|
312 | q = db_session.query(ReportGroup) | |||
|
313 | if app_ids: | |||
|
314 | q = q.filter(ReportGroup.resource_id.in_(app_ids)) | |||
|
315 | if order_by: | |||
|
316 | q = q.order_by(sa.desc(ReportGroup.id)) | |||
|
317 | return q | |||
|
318 | ||||
|
319 | @classmethod | |||
|
320 | def by_id(cls, group_id, app_ids=None, db_session=None): | |||
|
321 | db_session = get_db_session(db_session) | |||
|
322 | q = db_session.query(ReportGroup).filter( | |||
|
323 | ReportGroup.id == int(group_id)) | |||
|
324 | if app_ids: | |||
|
325 | q = q.filter(ReportGroup.resource_id.in_(app_ids)) | |||
|
326 | return q.first() | |||
|
327 | ||||
|
328 | @classmethod | |||
|
329 | def by_ids(cls, group_ids=None, db_session=None): | |||
|
330 | db_session = get_db_session(db_session) | |||
|
331 | query = db_session.query(ReportGroup) | |||
|
332 | query = query.filter(ReportGroup.id.in_(group_ids)) | |||
|
333 | return query | |||
|
334 | ||||
|
335 | @classmethod | |||
|
	 336 | def by_hash_and_resource(cls, resource_id, | |||
|
337 | grouping_hash, db_session=None): | |||
|
338 | db_session = get_db_session(db_session) | |||
|
339 | q = db_session.query(ReportGroup) | |||
|
340 | q = q.filter(ReportGroup.resource_id == resource_id) | |||
|
341 | q = q.filter(ReportGroup.grouping_hash == grouping_hash) | |||
|
342 | q = q.filter(ReportGroup.fixed == False) | |||
|
343 | return q.first() | |||
|
344 | ||||
|
345 | @classmethod | |||
|
346 | def users_commenting(cls, report_group, exclude_user_id=None, | |||
|
347 | db_session=None): | |||
|
348 | db_session = get_db_session(None, report_group) | |||
|
349 | query = db_session.query(User).distinct() | |||
|
350 | query = query.filter(User.id == ReportComment.owner_id) | |||
|
351 | query = query.filter(ReportComment.group_id == report_group.id) | |||
|
352 | if exclude_user_id: | |||
|
353 | query = query.filter(ReportComment.owner_id != exclude_user_id) | |||
|
354 | return query | |||
|
355 | ||||
|
356 | @classmethod | |||
|
357 | def affected_users_count(cls, report_group, db_session=None): | |||
|
358 | db_session = get_db_session(db_session) | |||
|
359 | query = db_session.query(sa.func.count(Report.username)) | |||
|
360 | query = query.filter(Report.group_id == report_group.id) | |||
|
361 | query = query.filter(Report.username != '') | |||
|
362 | query = query.filter(Report.username != None) | |||
|
363 | query = query.group_by(Report.username) | |||
|
364 | return query.count() | |||
|
365 | ||||
|
366 | @classmethod | |||
|
367 | def top_affected_users(cls, report_group, db_session=None): | |||
|
368 | db_session = get_db_session(db_session) | |||
|
369 | count_label = sa.func.count(Report.username).label('count') | |||
|
370 | query = db_session.query(Report.username, count_label) | |||
|
371 | query = query.filter(Report.group_id == report_group.id) | |||
|
372 | query = query.filter(Report.username != None) | |||
|
373 | query = query.filter(Report.username != '') | |||
|
374 | query = query.group_by(Report.username) | |||
|
375 | query = query.order_by(sa.desc(count_label)) | |||
|
376 | query = query.limit(50) | |||
|
377 | return query | |||
|
378 | ||||
|
379 | @classmethod | |||
|
380 | def get_report_stats(cls, request, filter_settings): | |||
|
381 | """ | |||
|
	 382 | Gets report dashboard graphs. | |||

	 383 | Returns data for bar charts with occurrence/interval information. | |||

	 384 | The detailed variant returns time intervals; the non-detailed | |||

	 385 | variant returns a total sum. | |||
|
386 | """ | |||
|
387 | delta = filter_settings['end_date'] - filter_settings['start_date'] | |||
|
388 | if delta < h.time_deltas.get('12h')['delta']: | |||
|
389 | interval = '1m' | |||
|
390 | elif delta <= h.time_deltas.get('3d')['delta']: | |||
|
391 | interval = '5m' | |||
|
392 | elif delta >= h.time_deltas.get('2w')['delta']: | |||
|
393 | interval = '24h' | |||
|
394 | else: | |||
|
395 | interval = '1h' | |||
|
396 | ||||
|
397 | group_id = filter_settings.get('group_id') | |||
|
398 | ||||
|
399 | es_query = { | |||
|
400 | 'aggs': {'parent_agg': {'aggs': {'types': { | |||
|
401 | 'aggs': {'sub_agg': {'terms': {'field': 'tags.type.values'}}}, | |||
|
402 | 'filter': { | |||
|
403 | 'and': [{'exists': {'field': 'tags.type.values'}}]} | |||
|
404 | }}, | |||
|
405 | 'date_histogram': {'extended_bounds': { | |||
|
406 | 'max': filter_settings['end_date'], | |||
|
407 | 'min': filter_settings['start_date']}, | |||
|
408 | 'field': 'timestamp', | |||
|
409 | 'interval': interval, | |||
|
410 | 'min_doc_count': 0}}}, | |||
|
411 | 'query': {'filtered': { | |||
|
412 | 'filter': {'and': [ | |||
|
413 | {'terms': { | |||
|
414 | 'resource_id': [filter_settings['resource'][0]]}}, | |||
|
415 | {'range': {'timestamp': { | |||
|
416 | 'gte': filter_settings['start_date'], | |||
|
417 | 'lte': filter_settings['end_date']}}}] | |||
|
418 | } | |||
|
419 | }} | |||
|
420 | } | |||
|
421 | if group_id: | |||
|
422 | parent_agg = es_query['aggs']['parent_agg'] | |||
|
423 | filters = parent_agg['aggs']['types']['filter']['and'] | |||
|
424 | filters.append({'terms': {'tags.group_id.values': [group_id]}}) | |||
|
425 | ||||
|
426 | index_names = es_index_name_limiter( | |||
|
427 | start_date=filter_settings['start_date'], | |||
|
428 | end_date=filter_settings['end_date'], | |||
|
429 | ixtypes=['reports']) | |||
|
430 | ||||
|
431 | if not index_names: | |||
|
432 | return [] | |||
|
433 | ||||
|
434 | result = Datastores.es.search(es_query, | |||
|
435 | index=index_names, | |||
|
436 | doc_type='log', | |||
|
437 | size=0) | |||
|
438 | series = [] | |||
|
439 | for bucket in result['aggregations']['parent_agg']['buckets']: | |||
|
440 | point = { | |||
|
441 | 'x': datetime.utcfromtimestamp(int(bucket['key']) / 1000), | |||
|
442 | 'report': 0, | |||
|
443 | 'not_found': 0, | |||
|
444 | 'slow_report': 0 | |||
|
445 | } | |||
|
446 | for subbucket in bucket['types']['sub_agg']['buckets']: | |||
|
447 | if subbucket['key'] == 'slow': | |||
|
448 | point['slow_report'] = subbucket['doc_count'] | |||
|
449 | elif subbucket['key'] == 'error': | |||
|
450 | point['report'] = subbucket['doc_count'] | |||
|
451 | elif subbucket['key'] == 'not_found': | |||
|
452 | point['not_found'] = subbucket['doc_count'] | |||
|
453 | series.append(point) | |||
|
454 | return series |
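`get_report_stats` (and `get_metrics_stats` below) picks a `date_histogram` interval from the width of the requested date range. Assuming thresholds equivalent to the `h.time_deltas` entries referenced above, the selection reduces to:

```python
from datetime import timedelta

def pick_interval(delta):
    """Map a date-range width to an ES date_histogram interval string."""
    if delta < timedelta(hours=12):
        return '1m'
    elif delta <= timedelta(days=3):
        return '5m'
    elif delta >= timedelta(weeks=2):
        return '24h'
    return '1h'
```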
@@ -0,0 +1,55 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from appenlight.models import Datastores | |||
|
23 | from appenlight.models.services.base import BaseService | |||
|
24 | from appenlight.lib.enums import ReportType | |||
|
25 | from appenlight.lib.utils import es_index_name_limiter | |||
|
26 | ||||
|
27 | ||||
|
28 | class ReportStatService(BaseService): | |||
|
29 | @classmethod | |||
|
30 | def count_by_type(cls, report_type, resource_id, since_when): | |||
|
31 | report_type = ReportType.key_from_value(report_type) | |||
|
32 | ||||
|
33 | index_names = es_index_name_limiter(start_date=since_when, | |||
|
34 | ixtypes=['reports']) | |||
|
35 | ||||
|
36 | es_query = { | |||
|
37 | 'aggs': {'reports': {'aggs': { | |||
|
38 | 'sub_agg': {'value_count': {'field': 'tags.group_id.values'}}}, | |||
|
39 | 'filter': {'and': [{'terms': {'resource_id': [resource_id]}}, | |||
|
40 | {'exists': { | |||
|
41 | 'field': 'tags.group_id.values'}}]}}}, | |||
|
42 | 'query': {'filtered': {'filter': { | |||
|
43 | 'and': [{'terms': {'resource_id': [resource_id]}}, | |||
|
44 | {'terms': {'tags.type.values': [report_type]}}, | |||
|
45 | {'range': {'timestamp': { | |||
|
46 | 'gte': since_when}}}]}}}} | |||
|
47 | ||||
|
48 | if index_names: | |||
|
49 | result = Datastores.es.search(es_query, | |||
|
50 | index=index_names, | |||
|
51 | doc_type='log', | |||
|
52 | size=0) | |||
|
53 | return result['aggregations']['reports']['sub_agg']['value'] | |||
|
54 | else: | |||
|
55 | return 0 |
@@ -0,0 +1,448 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from datetime import datetime | |||
|
23 | ||||
|
24 | import appenlight.lib.helpers as h | |||
|
25 | from appenlight.models import get_db_session, Datastores | |||
|
26 | from appenlight.models.services.base import BaseService | |||
|
27 | from appenlight.lib.enums import ReportType | |||
|
28 | from appenlight.lib.utils import es_index_name_limiter | |||
|
29 | ||||
|
30 | try: | |||
|
31 | from ae_uptime_ce.models.services.uptime_metric import \ | |||
|
32 | UptimeMetricService | |||
|
33 | except ImportError: | |||
|
34 | UptimeMetricService = None | |||
|
35 | ||||
|
36 | ||||
|
37 | def check_key(key, stats, uptime, total_seconds): | |||
|
38 | if key not in stats: | |||
|
39 | stats[key] = {'name': key, | |||
|
40 | 'requests': 0, | |||
|
41 | 'errors': 0, | |||
|
42 | 'tolerated_requests': 0, | |||
|
43 | 'frustrating_requests': 0, | |||
|
44 | 'satisfying_requests': 0, | |||
|
45 | 'total_minutes': total_seconds / 60.0, | |||
|
46 | 'uptime': uptime, | |||
|
47 | 'apdex': 0, | |||
|
48 | 'rpm': 0, | |||
|
49 | 'response_time': 0, | |||
|
50 | 'avg_response_time': 0} | |||
|
51 | ||||
|
52 | ||||
|
53 | class RequestMetricService(BaseService): | |||
|
54 | @classmethod | |||
|
55 | def get_metrics_stats(cls, request, filter_settings, db_session=None): | |||
|
56 | delta = filter_settings['end_date'] - filter_settings['start_date'] | |||
|
57 | if delta < h.time_deltas.get('12h')['delta']: | |||
|
58 | interval = '1m' | |||
|
59 | elif delta <= h.time_deltas.get('3d')['delta']: | |||
|
60 | interval = '5m' | |||
|
61 | elif delta >= h.time_deltas.get('2w')['delta']: | |||
|
62 | interval = '24h' | |||
|
63 | else: | |||
|
64 | interval = '1h' | |||
|
65 | ||||
|
66 | filter_settings['namespace'] = ['appenlight.request_metric'] | |||
|
67 | ||||
|
68 | es_query = { | |||
|
69 | 'aggs': { | |||
|
70 | 'parent_agg': { | |||
|
71 | 'aggs': {'custom': {'aggs': {'sub_agg': { | |||
|
72 | 'sum': {'field': 'tags.custom.numeric_values'}}}, | |||
|
73 | 'filter': {'exists': { | |||
|
74 | 'field': 'tags.custom.numeric_values'}}}, | |||
|
75 | 'main': {'aggs': {'sub_agg': {'sum': { | |||
|
76 | 'field': 'tags.main.numeric_values'}}}, | |||
|
77 | 'filter': {'exists': { | |||
|
78 | 'field': 'tags.main.numeric_values'}}}, | |||
|
79 | 'nosql': {'aggs': {'sub_agg': {'sum': { | |||
|
80 | 'field': 'tags.nosql.numeric_values'}}}, | |||
|
81 | 'filter': {'exists': { | |||
|
82 | 'field': 'tags.nosql.numeric_values'}}}, | |||
|
83 | 'remote': {'aggs': {'sub_agg': {'sum': { | |||
|
84 | 'field': 'tags.remote.numeric_values'}}}, | |||
|
85 | 'filter': {'exists': { | |||
|
86 | 'field': 'tags.remote.numeric_values'}}}, | |||
|
87 | 'requests': {'aggs': {'sub_agg': {'sum': { | |||
|
88 | 'field': 'tags.requests.numeric_values'}}}, | |||
|
89 | 'filter': {'exists': { | |||
|
90 | 'field': 'tags.requests.numeric_values'}}}, | |||
|
91 | 'sql': {'aggs': {'sub_agg': { | |||
|
92 | 'sum': {'field': 'tags.sql.numeric_values'}}}, | |||
|
93 | 'filter': {'exists': { | |||
|
94 | 'field': 'tags.sql.numeric_values'}}}, | |||
|
95 | 'tmpl': {'aggs': {'sub_agg': {'sum': { | |||
|
96 | 'field': 'tags.tmpl.numeric_values'}}}, | |||
|
97 | 'filter': {'exists': { | |||
|
98 | 'field': 'tags.tmpl.numeric_values'}}}}, | |||
|
99 | 'date_histogram': {'extended_bounds': { | |||
|
100 | 'max': filter_settings['end_date'], | |||
|
101 | 'min': filter_settings['start_date']}, | |||
|
102 | 'field': 'timestamp', | |||
|
103 | 'interval': interval, | |||
|
104 | 'min_doc_count': 0}}}, | |||
|
105 | 'query': {'filtered': { | |||
|
106 | 'filter': {'and': [{'terms': { | |||
|
107 | 'resource_id': [filter_settings['resource'][0]]}}, | |||
|
108 | {'range': {'timestamp': { | |||
|
109 | 'gte': filter_settings['start_date'], | |||
|
110 | 'lte': filter_settings['end_date']}}}, | |||
|
111 | {'terms': {'namespace': [ | |||
|
112 | 'appenlight.request_metric']}}]}}}} | |||
|
113 | ||||
|
114 | index_names = es_index_name_limiter( | |||
|
115 | start_date=filter_settings['start_date'], | |||
|
116 | end_date=filter_settings['end_date'], | |||
|
117 | ixtypes=['metrics']) | |||
|
118 | if not index_names: | |||
|
119 | return [] | |||
|
120 | ||||
|
121 | result = Datastores.es.search(es_query, | |||
|
122 | index=index_names, | |||
|
123 | doc_type='log', | |||
|
124 | size=0) | |||
|
125 | ||||
|
126 | plot_data = [] | |||
|
127 | for item in result['aggregations']['parent_agg']['buckets']: | |||
|
128 | x_time = datetime.utcfromtimestamp(int(item['key']) / 1000) | |||
|
129 | point = {"x": x_time} | |||
|
130 | for key in ['custom', 'main', 'nosql', 'remote', | |||
|
131 | 'requests', 'sql', 'tmpl']: | |||
|
132 | value = item[key]['sub_agg']['value'] | |||
|
133 | point[key] = round(value, 3) if value else 0 | |||
|
134 | plot_data.append(point) | |||
|
135 | ||||
|
136 | return plot_data | |||
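Each `date_histogram` bucket key arrives as an epoch timestamp in milliseconds; the loop above converts it to a `datetime` and rounds each metric sum to three decimals. Sketched as a helper (the name and default key list mirror the loop above but are illustrative):

```python
from datetime import datetime

METRIC_KEYS = ('custom', 'main', 'nosql', 'remote', 'requests', 'sql', 'tmpl')

def bucket_to_point(bucket, keys=METRIC_KEYS):
    """Turn one ES histogram bucket into a plot point for the dashboard."""
    point = {'x': datetime.utcfromtimestamp(int(bucket['key']) / 1000)}
    for key in keys:
        value = bucket[key]['sub_agg']['value']
        point[key] = round(value, 3) if value else 0
    return point
```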
|
137 | ||||
|
138 | @classmethod | |||
|
139 | def get_requests_breakdown(cls, request, filter_settings, | |||
|
140 | db_session=None): | |||
|
141 | db_session = get_db_session(db_session) | |||
|
142 | ||||
|
143 | # fetch total time of all requests in this time range | |||
|
144 | index_names = es_index_name_limiter( | |||
|
145 | start_date=filter_settings['start_date'], | |||
|
146 | end_date=filter_settings['end_date'], | |||
|
147 | ixtypes=['metrics']) | |||
|
148 | ||||
|
149 | if index_names and filter_settings['resource']: | |||
|
150 | es_query = { | |||
|
151 | 'aggs': {'main': {'aggs': { | |||
|
152 | 'sub_agg': {'sum': {'field': 'tags.main.numeric_values'}}}, | |||
|
153 | 'filter': {'exists': { | |||
|
154 | 'field': 'tags.main.numeric_values'}}}}, | |||
|
155 | 'query': {'filtered': { | |||
|
156 | 'filter': {'and': [ | |||
|
157 | {'terms': { | |||
|
158 | 'resource_id': [filter_settings['resource'][0]]}}, | |||
|
159 | {'range': {'timestamp': { | |||
|
160 | 'gte': filter_settings['start_date'], | |||
|
161 | 'lte': filter_settings['end_date']}}}, | |||
|
162 | {'terms': {'namespace': [ | |||
|
163 | 'appenlight.request_metric']}}]}}}} | |||
|
164 | result = Datastores.es.search(es_query, | |||
|
165 | index=index_names, | |||
|
166 | doc_type='log', | |||
|
167 | size=0) | |||
|
168 | total_time_spent = result['aggregations']['main']['sub_agg'][ | |||
|
169 | 'value'] | |||
|
170 | else: | |||
|
171 | total_time_spent = 0 | |||
|
172 | script_text = "doc['tags.main.numeric_values'].value / {}".format( | |||
|
173 | total_time_spent) | |||
|
174 | ||||
|
175 | if index_names and filter_settings['resource']: | |||
|
176 | es_query = { | |||
|
177 | 'aggs': { | |||
|
178 | 'parent_agg': { | |||
|
179 | 'aggs': {'main': {'aggs': { | |||
|
180 | 'sub_agg': { | |||
|
181 | 'sum': {'field': 'tags.main.numeric_values'}}}, | |||
|
182 | 'filter': { | |||
|
183 | 'exists': { | |||
|
184 | 'field': 'tags.main.numeric_values'}}}, | |||
|
185 | 'percentage': { | |||
|
186 | 'aggs': {'sub_agg': { | |||
|
187 | 'sum': { | |||
|
188 | 'lang': 'expression', | |||
|
189 | 'script': script_text}}}, | |||
|
190 | 'filter': { | |||
|
191 | 'exists': { | |||
|
192 | 'field': 'tags.main.numeric_values'}}}, | |||
|
193 | 'requests': {'aggs': {'sub_agg': { | |||
|
194 | 'sum': { | |||
|
195 | 'field': 'tags.requests.numeric_values'}}}, | |||
|
196 | 'filter': {'exists': { | |||
|
197 | 'field': 'tags.requests.numeric_values'}}}}, | |||
|
198 | 'terms': {'field': 'tags.view_name.values', | |||
|
199 | 'order': {'percentage>sub_agg': 'desc'}, | |||
|
200 | 'size': 15}}}, | |||
|
201 | 'query': {'filtered': {'filter': {'and': [ | |||
|
202 | {'terms': { | |||
|
203 | 'resource_id': [filter_settings['resource'][0]]}}, | |||
|
204 | {'range': { | |||
|
205 | 'timestamp': {'gte': filter_settings['start_date'], | |||
|
206 | 'lte': filter_settings['end_date'] | |||
|
207 | } | |||
|
208 | } | |||
|
209 | } | |||
|
210 | ]} | |||
|
211 | }} | |||
|
212 | } | |||
|
213 | result = Datastores.es.search(es_query, | |||
|
214 | index=index_names, | |||
|
215 | doc_type='log', | |||
|
216 | size=0) | |||
|
217 | series = result['aggregations']['parent_agg']['buckets'] | |||
|
218 | else: | |||
|
219 | series = [] | |||
|
220 | ||||
|
221 | and_part = [ | |||
|
222 | {"term": {"resource_id": filter_settings['resource'][0]}}, | |||
|
223 | {"terms": {"tags.view_name.values": [row['key'] for | |||
|
224 | row in series]}}, | |||
|
225 | {"term": {"report_type": str(ReportType.slow)}} | |||
|
226 | ] | |||
|
227 | query = { | |||
|
228 | "aggs": { | |||
|
229 | "top_reports": { | |||
|
230 | "terms": { | |||
|
231 | "field": "tags.view_name.values", | |||
|
232 | "size": len(series) | |||
|
233 | }, | |||
|
234 | "aggs": { | |||
|
235 | "top_calls_hits": { | |||
|
236 | "top_hits": { | |||
|
237 | "sort": {"start_time": "desc"}, | |||
|
238 | "size": 5 | |||
|
239 | } | |||
|
240 | } | |||
|
241 | } | |||
|
242 | } | |||
|
243 | }, | |||
|
244 | ||||
|
245 | "query": { | |||
|
246 | "filtered": { | |||
|
247 | "filter": { | |||
|
248 | "and": and_part | |||
|
249 | } | |||
|
250 | } | |||
|
251 | } | |||
|
252 | } | |||
|
253 | details = {} | |||
|
254 | index_names = es_index_name_limiter(ixtypes=['reports']) | |||
|
255 | if index_names and series: | |||
|
256 | result = Datastores.es.search( | |||
|
257 | query, doc_type='report', size=0, index=index_names) | |||
|
258 | for bucket in result['aggregations']['top_reports']['buckets']: | |||
|
259 | details[bucket['key']] = [] | |||
|
260 | ||||
|
261 | for hit in bucket['top_calls_hits']['hits']['hits']: | |||
|
262 | details[bucket['key']].append( | |||
|
263 | {'report_id': hit['_source']['pg_id'], | |||
|
264 | 'group_id': hit['_source']['group_id']} | |||
|
265 | ) | |||
|
266 | ||||
|
267 | results = [] | |||
|
268 | for row in series: | |||
|
269 | result = { | |||
|
270 | 'key': row['key'], | |||
|
271 | 'main': row['main']['sub_agg']['value'], | |||
|
272 | 'requests': row['requests']['sub_agg']['value'] | |||
|
273 | } | |||
|
274 | # es can return 'infinity' | |||
|
275 | try: | |||
|
276 | result['percentage'] = float( | |||
|
277 | row['percentage']['sub_agg']['value']) | |||
|
278 | except ValueError: | |||
|
279 | result['percentage'] = 0 | |||
|
280 | ||||
|
281 | result['latest_details'] = details.get(row['key']) or [] | |||
|
282 | results.append(result) | |||
|
283 | ||||
|
284 | return results | |||
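As the comment above notes, Elasticsearch can report `'Infinity'` for the expression-script percentage when `total_time_spent` is zero. Note that `float('Infinity')` parses successfully in Python without raising `ValueError`, so a stricter conversion would check finiteness explicitly. A defensive sketch (not the shipped behaviour, which only catches `ValueError`):

```python
import math

def safe_percentage(raw):
    """Convert an aggregation value to a finite float, defaulting to 0."""
    try:
        value = float(raw)
    except (TypeError, ValueError):
        return 0.0
    # float('Infinity') and float('NaN') parse fine, so reject them here
    return value if math.isfinite(value) else 0.0
```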
|
285 | ||||
|
286 | @classmethod | |||
|
287 | def get_apdex_stats(cls, request, filter_settings, | |||
|
288 | threshold=1, db_session=None): | |||
|
289 | """ | |||
|
	 290 | Returns information and calculates the Apdex score per server | |||

	 291 | for the dashboard server stats (upper-right stat boxes) | |||
|
292 | """ | |||
|
293 | # Apdex t = (Satisfied Count + Tolerated Count / 2) / Total Samples | |||
|
294 | db_session = get_db_session(db_session) | |||
|
295 | index_names = es_index_name_limiter( | |||
|
296 | start_date=filter_settings['start_date'], | |||
|
297 | end_date=filter_settings['end_date'], ixtypes=['metrics']) | |||
|
298 | ||||
|
299 | requests_series = [] | |||
|
300 | ||||
|
301 | if index_names and filter_settings['resource']: | |||
|
302 | es_query = { | |||
|
303 | 'aggs': { | |||
|
304 | 'parent_agg': {'aggs': { | |||
|
305 | 'frustrating': {'aggs': {'sub_agg': { | |||
|
306 | 'sum': {'field': 'tags.requests.numeric_values'}}}, | |||
|
307 | 'filter': {'and': [ | |||
|
308 | {'range': { | |||
|
309 | 'tags.main.numeric_values': {'gte': '4'}}}, | |||
|
310 | {'exists': { | |||
|
311 | 'field': 'tags.requests.numeric_values'}}] | |||
|
312 | } | |||
|
313 | }, | |||
|
314 | 'main': {'aggs': {'sub_agg': {'sum': { | |||
|
315 | 'field': 'tags.main.numeric_values'}}}, | |||
|
316 | 'filter': {'exists': { | |||
|
317 | 'field': 'tags.main.numeric_values'}}}, | |||
|
318 | 'requests': {'aggs': {'sub_agg': { | |||
|
319 | 'sum': { | |||
|
320 | 'field': 'tags.requests.numeric_values'}}}, | |||
|
321 | 'filter': {'exists': { | |||
|
322 | 'field': 'tags.requests.numeric_values'}}}, | |||
|
323 | 'tolerated': {'aggs': {'sub_agg': { | |||
|
324 | 'sum': { | |||
|
325 | 'field': 'tags.requests.numeric_values'}}}, | |||
|
326 | 'filter': {'and': [ | |||
|
327 | {'range': { | |||
|
328 | 'tags.main.numeric_values': {'gte': '1'}}}, | |||
|
329 | {'range': { | |||
|
330 | 'tags.main.numeric_values': {'lt': '4'}}}, | |||
|
331 | {'exists': { | |||
|
332 | 'field': 'tags.requests.numeric_values'}}]} | |||
|
333 | } | |||
|
334 | }, | |||
|
335 | 'terms': {'field': 'tags.server_name.values', | |||
|
336 | 'size': 999999}}}, | |||
|
337 | 'query': { | |||
|
338 | 'filtered': { | |||
|
339 | 'filter': {'and': [{'terms': { | |||
|
340 | 'resource_id': [ | |||
|
341 | filter_settings['resource'][0]]}}, | |||
|
342 | {'range': {'timestamp': { | |||
|
343 | 'gte': filter_settings['start_date'], | |||
|
344 | 'lte': filter_settings['end_date']}}}, | |||
|
345 | {'terms': {'namespace': [ | |||
|
346 | 'appenlight.request_metric']}}]}}}} | |||
|
347 | ||||
|
348 | result = Datastores.es.search(es_query, | |||
|
349 | index=index_names, | |||
|
350 | doc_type='log', | |||
|
351 | size=0) | |||
|
352 | for bucket in result['aggregations']['parent_agg']['buckets']: | |||
|
353 | requests_series.append({ | |||
|
354 | 'frustrating': bucket['frustrating']['sub_agg']['value'], | |||
|
355 | 'main': bucket['main']['sub_agg']['value'], | |||
|
356 | 'requests': bucket['requests']['sub_agg']['value'], | |||
|
357 | 'tolerated': bucket['tolerated']['sub_agg']['value'], | |||
|
358 | 'key': bucket['key'] | |||
|
359 | }) | |||
|
360 | ||||
|
361 | since_when = filter_settings['start_date'] | |||
|
362 | until = filter_settings['end_date'] | |||
|
363 | ||||
|
364 | # total errors | |||
|
365 | ||||
|
366 | index_names = es_index_name_limiter( | |||
|
367 | start_date=filter_settings['start_date'], | |||
|
368 | end_date=filter_settings['end_date'], ixtypes=['reports']) | |||
|
369 | ||||
|
370 | report_series = [] | |||
|
371 | if index_names and filter_settings['resource']: | |||
|
372 | report_type = ReportType.key_from_value(ReportType.error) | |||
|
373 | es_query = { | |||
|
374 | 'aggs': { | |||
|
375 | 'parent_agg': {'aggs': {'errors': {'aggs': {'sub_agg': { | |||
|
376 | 'sum': { | |||
|
377 | 'field': 'tags.occurences.numeric_values'}}}, | |||
|
378 | 'filter': {'and': [ | |||
|
379 | {'terms': { | |||
|
380 | 'tags.type.values': [report_type]}}, | |||
|
381 | {'exists': { | |||
|
382 | 'field': 'tags.occurences.numeric_values'}}] | |||
|
383 | } | |||
|
384 | }}, | |||
|
385 | 'terms': {'field': 'tags.server_name.values', | |||
|
386 | 'size': 999999}}}, | |||
|
387 | 'query': {'filtered': { | |||
|
388 | 'filter': {'and': [ | |||
|
389 | {'terms': { | |||
|
390 | 'resource_id': [filter_settings['resource'][0]]}}, | |||
|
391 | {'range': { | |||
|
392 | 'timestamp': {'gte': filter_settings['start_date'], | |||
|
393 | 'lte': filter_settings['end_date']}} | |||
|
394 | }, | |||
|
395 | {'terms': {'namespace': ['appenlight.error']}}] | |||
|
396 | } | |||
|
397 | }} | |||
|
398 | } | |||
|
399 | result = Datastores.es.search(es_query, | |||
|
400 | index=index_names, | |||
|
401 | doc_type='log', | |||
|
402 | size=0) | |||
|
403 | for bucket in result['aggregations']['parent_agg']['buckets']: | |||
|
404 | report_series.append( | |||
|
405 | {'key': bucket['key'], | |||
|
406 | 'errors': bucket['errors']['sub_agg']['value'] | |||
|
407 | } | |||
|
408 | ) | |||
|
409 | ||||
|
410 | stats = {} | |||
|
411 | if UptimeMetricService is not None: | |||
|
412 | uptime = UptimeMetricService.get_uptime_by_app( | |||
|
413 | filter_settings['resource'][0], | |||
|
414 | since_when=since_when, until=until) | |||
|
415 | else: | |||
|
416 | uptime = 0 | |||
|
417 | ||||
|
418 | total_seconds = (until - since_when).total_seconds() | |||
|
419 | ||||
|
420 | for stat in requests_series: | |||
|
421 | check_key(stat['key'], stats, uptime, total_seconds) | |||
|
422 | stats[stat['key']]['requests'] = int(stat['requests']) | |||
|
423 | stats[stat['key']]['response_time'] = stat['main'] | |||
|
424 | stats[stat['key']]['tolerated_requests'] = stat['tolerated'] | |||
|
425 | stats[stat['key']]['frustrating_requests'] = stat['frustrating'] | |||
|
426 | for server in report_series: | |||
|
427 | check_key(server['key'], stats, uptime, total_seconds) | |||
|
428 | stats[server['key']]['errors'] = server['errors'] | |||
|
429 | ||||
|
430 | server_stats = list(stats.values()) | |||
|
431 | for stat in server_stats: | |||
|
432 | stat['satisfying_requests'] = stat['requests'] - stat['errors'] \ | |||
|
433 | - stat['frustrating_requests'] - \ | |||
|
434 | stat['tolerated_requests'] | |||
|
435 | if stat['satisfying_requests'] < 0: | |||
|
436 | stat['satisfying_requests'] = 0 | |||
|
437 | ||||
|
438 | if stat['requests']: | |||
|
439 | stat['avg_response_time'] = round(stat['response_time'] / | |||
|
440 | stat['requests'], 3) | |||
|
441 | qual_requests = stat['satisfying_requests'] + \ | |||
|
442 | stat['tolerated_requests'] / 2.0 | |||
|
443 | stat['apdex'] = round((qual_requests / stat['requests']) * 100, | |||
|
444 | 2) | |||
|
445 | stat['rpm'] = round(stat['requests'] / stat['total_minutes'], | |||
|
446 | 2) | |||
|
447 | ||||
|
448 | return sorted(server_stats, key=lambda x: x['name']) |
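The Apdex calculation in `get_apdex_stats` follows the standard formula noted in its comment, Apdex_t = (satisfied + tolerated / 2) / total samples, expressed as a percentage rounded to two decimals. As a standalone sketch:

```python
def apdex_score(satisfying, tolerated, total):
    """Apdex = (satisfied + tolerated / 2) / total, as a percentage."""
    if not total:
        # no requests in the window: report 0 rather than divide by zero
        return 0.0
    return round((satisfying + tolerated / 2.0) / total * 100, 2)
```

With 60 satisfying and 30 tolerated requests out of 100, the score is (60 + 15) / 100 = 75%.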
@@ -0,0 +1,174 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from appenlight.models import get_db_session, Datastores | |||
|
23 | from appenlight.models.report import Report | |||
|
24 | from appenlight.models.services.base import BaseService | |||
|
25 | from appenlight.lib.utils import es_index_name_limiter | |||
|
26 | ||||
|
27 | ||||
|
28 | class SlowCallService(BaseService): | |||
|
29 | @classmethod | |||
|
30 | def get_time_consuming_calls(cls, request, filter_settings, | |||
|
31 | db_session=None): | |||
|
32 | db_session = get_db_session(db_session) | |||
|
33 | # get slow calls from older partitions too | |||
|
34 | index_names = es_index_name_limiter( | |||
|
35 | start_date=filter_settings['start_date'], | |||
|
36 | end_date=filter_settings['end_date'], | |||
|
37 | ixtypes=['slow_calls']) | |||
|
38 | if index_names and filter_settings['resource']: | |||
|
39 | # get longest time taking hashes | |||
|
40 | es_query = { | |||
|
41 | 'aggs': { | |||
|
42 | 'parent_agg': { | |||
|
43 | 'aggs': { | |||
|
44 | 'duration': { | |||
|
45 | 'aggs': {'sub_agg': { | |||
|
46 | 'sum': { | |||
|
47 | 'field': 'tags.duration.numeric_values'} | |||
|
48 | }}, | |||
|
49 | 'filter': {'exists': { | |||
|
50 | 'field': 'tags.duration.numeric_values'}}}, | |||
|
51 | 'total': { | |||
|
52 | 'aggs': {'sub_agg': {'value_count': { | |||
|
53 | 'field': 'tags.statement_hash.values'}}}, | |||
|
54 | 'filter': {'exists': { | |||
|
55 | 'field': 'tags.statement_hash.values'}}}}, | |||
|
56 | 'terms': {'field': 'tags.statement_hash.values', | |||
|
57 | 'order': {'duration>sub_agg': 'desc'}, | |||
|
58 | 'size': 15}}}, | |||
|
59 | 'query': {'filtered': { | |||
|
60 | 'filter': {'and': [ | |||
|
61 | {'terms': { | |||
|
62 | 'resource_id': [filter_settings['resource'][0]] | |||
|
63 | }}, | |||
|
64 | {'range': {'timestamp': { | |||
|
65 | 'gte': filter_settings['start_date'], | |||
|
66 | 'lte': filter_settings['end_date']} | |||
|
67 | }}] | |||
|
68 | } | |||
|
69 | } | |||
|
70 | } | |||
|
71 | } | |||
|
72 | result = Datastores.es.search( | |||
|
73 | es_query, index=index_names, doc_type='log', size=0) | |||
|
74 | results = result['aggregations']['parent_agg']['buckets'] | |||
|
75 | else: | |||
|
76 | return [] | |||
|
77 | hashes = [i['key'] for i in results] | |||
|
78 | ||||
|
79 | # get queries associated with hashes | |||
|
80 | calls_query = { | |||
|
81 | "aggs": { | |||
|
82 | "top_calls": { | |||
|
83 | "terms": { | |||
|
84 | "field": "tags.statement_hash.values", | |||
|
85 | "size": 15 | |||
|
86 | }, | |||
|
87 | "aggs": { | |||
|
88 | "top_calls_hits": { | |||
|
89 | "top_hits": { | |||
|
90 | "sort": {"timestamp": "desc"}, | |||
|
91 | "size": 5 | |||
|
92 | } | |||
|
93 | } | |||
|
94 | } | |||
|
95 | } | |||
|
96 | }, | |||
|
97 | "query": { | |||
|
98 | "filtered": { | |||
|
99 | "filter": { | |||
|
100 | "and": [ | |||
|
101 | { | |||
|
102 | "terms": { | |||
|
103 | "resource_id": [ | |||
|
104 | filter_settings['resource'][0] | |||
|
105 | ] | |||
|
106 | } | |||
|
107 | }, | |||
|
108 | { | |||
|
109 | "terms": { | |||
|
110 | "tags.statement_hash.values": hashes | |||
|
111 | } | |||
|
112 | }, | |||
|
113 | { | |||
|
114 | "range": { | |||
|
115 | "timestamp": { | |||
|
116 | "gte": filter_settings['start_date'], | |||
|
117 | "lte": filter_settings['end_date'] | |||
|
118 | } | |||
|
119 | } | |||
|
120 | } | |||
|
121 | ] | |||
|
122 | } | |||
|
123 | } | |||
|
124 | } | |||
|
125 | } | |||
|
126 | calls = Datastores.es.search(calls_query, | |||
|
127 | index=index_names, | |||
|
128 | doc_type='log', | |||
|
129 | size=0) | |||
|
130 | call_results = {} | |||
|
131 | report_ids = [] | |||
|
132 | for call in calls['aggregations']['top_calls']['buckets']: | |||
|
133 | hits = call['top_calls_hits']['hits']['hits'] | |||
|
134 | call_results[call['key']] = [i['_source'] for i in hits] | |||
|
135 | report_ids.extend([i['_source']['tags']['report_id']['values'] | |||
|
136 | for i in hits]) | |||
|
137 | if report_ids: | |||
|
138 | r_query = db_session.query(Report.group_id, Report.id) | |||
|
139 | r_query = r_query.filter(Report.id.in_(report_ids)) | |||
|
140 | r_query = r_query.filter( | |||
|
141 | Report.start_time >= filter_settings['start_date']) | |||
|
142 | else: | |||
|
143 | r_query = [] | |||
|
144 | reports_reversed = {} | |||
|
145 | for report in r_query: | |||
|
146 | reports_reversed[report.id] = report.group_id | |||
|
147 | ||||
|
148 | final_results = [] | |||
|
149 | for item in results: | |||
|
150 | if item['key'] not in call_results: | |||
|
151 | continue | |||
|
152 | call = call_results[item['key']][0] | |||
|
153 | row = {'occurences': item['total']['sub_agg']['value'], | |||
|
154 | 'total_duration': round( | |||
|
155 | item['duration']['sub_agg']['value']), | |||
|
156 | 'statement': call['message'], | |||
|
157 | 'statement_type': call['tags']['type']['values'], | |||
|
158 | 'statement_subtype': call['tags']['subtype']['values'], | |||
|
159 | 'statement_hash': item['key'], | |||
|
160 | 'latest_details': []} | |||
|
161 | if row['statement_type'] in ['tmpl', 'remote']: | |||
|
162 | params = call['tags']['parameters']['values'] \ | |||
|
163 | if 'parameters' in call['tags'] else '' | |||
|
164 | row['statement'] = '{} ({})'.format(call['message'], params) | |||
|
165 | for call in call_results[item['key']]: | |||
|
166 | report_id = call['tags']['report_id']['values'] | |||
|
167 | group_id = reports_reversed.get(report_id) | |||
|
168 | if group_id: | |||
|
169 | row['latest_details'].append( | |||
|
170 | {'group_id': group_id, 'report_id': report_id}) | |||
|
171 | ||||
|
172 | final_results.append(row) | |||
|
173 | ||||
|
174 | return final_results |
@@ -0,0 +1,102 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | from appenlight.lib.ext_json import json | |||
|
24 | from pyramid.threadlocal import get_current_registry | |||
|
25 | from appenlight.models.tag import Tag | |||
|
26 | from appenlight.models.services.base import BaseService | |||
|
27 | from appenlight.models import get_db_session | |||
|
28 | ||||
|
29 | ||||
|
30 | class TagService(BaseService): | |||
|
31 | @classmethod | |||
|
32 | def cut_name(cls, tag_name): | |||
|
33 | return tag_name[:32] | |||
|
34 | ||||
|
35 | @classmethod | |||
|
36 | def cut_value(cls, value): | |||
|
37 | if isinstance(value, str): | |||
|
38 | value = value[:128] | |||
|
39 | return value | |||
|
40 | ||||
|
41 | @classmethod | |||
|
42 | def by_resource_id_and_value(cls, resource_id, tag_name, value, | |||
|
43 | db_session=None, create_missing=True): | |||
|
44 | """ | |||
|
45 | Fetches tag and creates a new one if missing | |||
|
46 | """ | |||
|
47 | db_session = get_db_session(db_session) | |||
|
48 | registry = get_current_registry() | |||
|
49 | ||||
|
50 | @registry.cache_regions.redis_min_10.cache_on_arguments( | |||
|
51 | namespace='TagService.by_resource_id_and_value') | |||
|
52 | def cached(resource_id, tag_name, value): | |||
|
53 | reduced_name = cls.cut_name(tag_name.decode('utf8')) | |||
|
54 | reduced_value = cls.cut_value(value.decode('utf8')) | |||
|
55 | ||||
|
56 | query = db_session.query(Tag) | |||
|
57 | query = query.filter(Tag.resource_id == resource_id) | |||
|
58 | query = query.filter(Tag.name == reduced_name) | |||
|
59 | query = query.filter(sa.cast(Tag.value, sa.types.TEXT) == | |||
|
60 | sa.cast(json.dumps(reduced_value), | |||
|
61 | sa.types.TEXT)) | |||
|
62 | tag = query.first() | |||
|
63 | if tag: | |||
|
64 | db_session.expunge(tag) | |||
|
65 | return tag | |||
|
66 | ||||
|
67 | view = cached(resource_id, tag_name.encode('utf8'), | |||
|
68 | value.encode('utf8')) | |||
|
69 | if not view and create_missing: | |||
|
70 | view = cls.create_tag(resource_id, | |||
|
71 | cls.cut_name(tag_name), | |||
|
72 | cls.cut_value(value), | |||
|
73 | db_session) | |||
|
74 | cached.invalidate(resource_id, tag_name.encode('utf8'), | |||
|
75 | value.encode('utf8')) | |||
|
76 | return view | |||
|
77 | ||||
|
78 | @classmethod | |||
|
79 | def create_tag(cls, resource_id, tag_name, value, db_session=None): | |||
|
80 | ||||
|
81 | tag = Tag(resource_id=resource_id, | |||
|
82 | name=cls.cut_name(tag_name), | |||
|
83 | value=cls.cut_value(value)) | |||
|
84 | db_session = get_db_session(db_session) | |||
|
85 | db_session.add(tag) | |||
|
86 | db_session.flush() | |||
|
87 | return tag | |||
|
88 | ||||
|
89 | @classmethod | |||
|
90 | def by_tag_id(cls, tag_id, db_session=None): | |||
|
91 | db_session = get_db_session(db_session) | |||
|
92 | registry = get_current_registry() | |||
|
93 | ||||
|
94 | @registry.cache_regions.redis_min_10.cache_on_arguments( | |||
|
95 | namespace='TagService.by_tag_id') | |||
|
96 | def cached(tag_id): | |||
|
97 | tag = db_session.query(Tag).filter(Tag.id == tag_id).first() | |||
|
98 | if tag: | |||
|
99 | db_session.expunge(tag) | |||
|
100 | return tag | |||
|
101 | ||||
|
102 | return cached(tag_id) |
@@ -0,0 +1,158 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | import pyramid_mailer | |||
|
24 | import pyramid.renderers | |||
|
25 | import sqlalchemy as sa | |||
|
26 | ||||
|
27 | from collections import namedtuple | |||
|
28 | from datetime import datetime | |||
|
29 | ||||
|
30 | from appenlight.lib.rule import Rule | |||
|
31 | from appenlight.models import get_db_session | |||
|
32 | from appenlight.models.integrations import IntegrationException | |||
|
33 | from appenlight.models.report import REPORT_TYPE_MATRIX | |||
|
34 | from appenlight.models.user import User | |||
|
35 | from appenlight.models.services.base import BaseService | |||
|
36 | from paginate_sqlalchemy import SqlalchemyOrmPage | |||
|
37 | from pyramid.threadlocal import get_current_registry | |||
|
38 | ||||
|
39 | log = logging.getLogger(__name__) | |||
|
40 | ||||
|
41 | GroupOccurence = namedtuple('GroupOccurence', ['occurences', 'group']) | |||
|
42 | ||||
|
43 | ||||
|
44 | class UserService(BaseService): | |||
|
45 | @classmethod | |||
|
46 | def all(cls, db_session=None): | |||
|
47 | return get_db_session(db_session).query(User).order_by(User.user_name) | |||
|
48 | ||||
|
49 | @classmethod | |||
|
50 | def send_email(cls, request, recipients, variables, template, | |||
|
51 | immediately=False, silent=False): | |||
|
52 | html = pyramid.renderers.render(template, variables, request) | |||
|
53 | title = variables.get('email_title', | |||
|
54 | variables.get('title', "No Title")) | |||
|
55 | title = title.replace('\r', '').replace('\n', '') | |||
|
56 | sender = "{} <{}>".format( | |||
|
57 | request.registry.settings['mailing.from_name'], | |||
|
58 | request.registry.settings['mailing.from_email']) | |||
|
59 | message = pyramid_mailer.message.Message( | |||
|
60 | subject=title, sender=sender, recipients=recipients, html=html) | |||
|
61 | if immediately: | |||
|
62 | try: | |||
|
63 | request.registry.mailer.send_immediately(message) | |||
|
64 | except Exception as e: | |||
|
65 | log.warning('Exception %s' % e) | |||
|
66 | if not silent: | |||
|
67 | raise | |||
|
68 | else: | |||
|
69 | request.registry.mailer.send(message) | |||
|
70 | ||||
|
71 | @classmethod | |||
|
72 | def get_paginator(cls, page=1, item_count=None, items_per_page=50, | |||
|
73 | order_by=None, filter_settings=None, | |||
|
74 | exclude_columns=None, db_session=None): | |||
|
75 | registry = get_current_registry() | |||
|
76 | if not exclude_columns: | |||
|
77 | exclude_columns = [] | |||
|
78 | if not filter_settings: | |||
|
79 | filter_settings = {} | |||
|
80 | db_session = get_db_session(db_session) | |||
|
81 | q = db_session.query(User) | |||
|
82 | if filter_settings.get('order_col'): | |||
|
83 | order_col = filter_settings.get('order_col') | |||
|
84 | if filter_settings.get('order_dir') == 'dsc': | |||
|
85 | sort_on = 'desc' | |||
|
86 | else: | |||
|
87 | sort_on = 'asc' | |||
|
88 | q = q.order_by(getattr(sa, sort_on)(getattr(User, order_col))) | |||
|
89 | else: | |||
|
90 | q = q.order_by(sa.desc(User.registered_date)) | |||
|
91 | # remove urlgen or it never caches count | |||
|
92 | cache_params = dict(filter_settings) | |||
|
93 | cache_params.pop('url', None) | |||
|
94 | cache_params.pop('url_maker', None) | |||
|
95 | ||||
|
96 | @registry.cache_regions.redis_min_5.cache_on_arguments() | |||
|
97 | def estimate_users(cache_key): | |||
|
98 | o_q = q.order_by(False) | |||
|
99 | return o_q.count() | |||
|
100 | ||||
|
101 | item_count = estimate_users(cache_params) | |||
|
102 | # if the number of pages is low we may want to invalidate the count to | |||
|
103 | # provide 'real time' update - use case - | |||
|
104 | # errors just started to flow in | |||
|
105 | if item_count < 1000: | |||
|
106 | item_count = estimate_users.refresh(cache_params) | |||
|
107 | paginator = SqlalchemyOrmPage(q, page=page, | |||
|
108 | item_count=item_count, | |||
|
109 | items_per_page=items_per_page, | |||
|
110 | **filter_settings) | |||
|
111 | return paginator | |||
|
112 | ||||
|
113 | @classmethod | |||
|
114 | def get_valid_channels(cls, user): | |||
|
115 | return [channel for channel in user.alert_channels | |||
|
116 | if channel.channel_validated] | |||
|
117 | ||||
|
118 | @classmethod | |||
|
119 | def report_notify(cls, user, request, application, report_groups, | |||
|
120 | occurence_dict, since_when=None, db_session=None): | |||
|
121 | db_session = get_db_session(db_session) | |||
|
122 | if not report_groups: | |||
|
123 | return True | |||
|
124 | ||||
|
125 | if not since_when: | |||
|
126 | since_when = datetime.utcnow() | |||
|
127 | for channel in cls.get_valid_channels(user): | |||
|
128 | confirmed_groups = [] | |||
|
129 | ||||
|
130 | for group in report_groups: | |||
|
131 | occurences = occurence_dict.get(group.id, 1) | |||
|
132 | for action in channel.channel_actions: | |||
|
133 | not_matched = ( | |||
|
134 | action.resource_id and action.resource_id != | |||
|
135 | application.resource_id) | |||
|
136 | if action.type != 'report' or not_matched: | |||
|
137 | continue | |||
|
138 | should_notify = (action.action == 'always' or | |||
|
139 | not group.notified) | |||
|
140 | rule_obj = Rule(action.rule, REPORT_TYPE_MATRIX) | |||
|
141 | report_dict = group.get_report().get_dict(request) | |||
|
142 | if rule_obj.match(report_dict) and should_notify: | |||
|
143 | item = GroupOccurence(occurences, group) | |||
|
144 | if item not in confirmed_groups: | |||
|
145 | confirmed_groups.append(item) | |||
|
146 | ||||
|
147 | # send individual reports | |||
|
148 | total_confirmed = len(confirmed_groups) | |||
|
149 | if not total_confirmed: | |||
|
150 | continue | |||
|
151 | try: | |||
|
152 | channel.notify_reports(resource=application, | |||
|
153 | user=user, | |||
|
154 | request=request, | |||
|
155 | since_when=since_when, | |||
|
156 | reports=confirmed_groups) | |||
|
157 | except IntegrationException as e: | |||
|
158 | log.warning('%s' % e) |
@@ -0,0 +1,120 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | import hashlib | |||
|
24 | ||||
|
25 | from datetime import datetime, timedelta | |||
|
26 | from appenlight.models import Base | |||
|
27 | from sqlalchemy.dialects.postgresql import JSON | |||
|
28 | from ziggurat_foundations.models.base import BaseModel | |||
|
29 | ||||
|
30 | ||||
|
31 | class SlowCall(Base, BaseModel): | |||
|
32 | __tablename__ = 'slow_calls' | |||
|
33 | __table_args__ = {'implicit_returning': False} | |||
|
34 | ||||
|
35 | resource_id = sa.Column(sa.Integer(), nullable=False, index=True) | |||
|
36 | id = sa.Column(sa.Integer, nullable=False, primary_key=True) | |||
|
37 | report_id = sa.Column(sa.BigInteger, | |||
|
38 | sa.ForeignKey('reports.id', | |||
|
39 | ondelete='cascade', | |||
|
40 | onupdate='cascade'), | |||
|
41 | primary_key=True) | |||
|
42 | duration = sa.Column(sa.Float(), default=0) | |||
|
43 | statement = sa.Column(sa.UnicodeText(), default='') | |||
|
44 | statement_hash = sa.Column(sa.Unicode(60), default='') | |||
|
45 | parameters = sa.Column(JSON(), nullable=False, default=dict) | |||
|
46 | type = sa.Column(sa.Unicode(16), default='') | |||
|
47 | subtype = sa.Column(sa.Unicode(16), default=None) | |||
|
48 | location = sa.Column(sa.Unicode(255), default='') | |||
|
49 | timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
50 | server_default=sa.func.now()) | |||
|
51 | report_group_time = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
52 | server_default=sa.func.now()) | |||
|
53 | ||||
|
54 | def set_data(self, data, protocol_version=None, resource_id=None, | |||
|
55 | report_group=None): | |||
|
56 | self.resource_id = resource_id | |||
|
57 | if data.get('start') and data.get('end'): | |||
|
58 | self.timestamp = data.get('start') | |||
|
59 | d = data.get('end') - data.get('start') | |||
|
60 | self.duration = d.total_seconds() | |||
|
61 | self.statement = data.get('statement', '') | |||
|
62 | self.type = data.get('type', 'unknown')[:16] | |||
|
63 | self.parameters = data.get('parameters', {}) | |||
|
64 | self.location = data.get('location', '')[:255] | |||
|
65 | self.report_group_time = report_group.first_timestamp | |||
|
66 | if 'subtype' in data: | |||
|
67 | self.subtype = data.get('subtype', 'unknown')[:16] | |||
|
68 | if self.type == 'tmpl': | |||
|
69 | self.set_hash('{} {}'.format(self.statement, self.parameters)) | |||
|
70 | else: | |||
|
71 | self.set_hash() | |||
|
72 | ||||
|
73 | def set_hash(self, custom_statement=None): | |||
|
74 | statement = custom_statement or self.statement | |||
|
75 | self.statement_hash = hashlib.sha1( | |||
|
76 | statement.encode('utf8')).hexdigest() | |||
|
77 | ||||
|
78 | @property | |||
|
79 | def end_time(self): | |||
|
80 | if self.duration and self.timestamp: | |||
|
81 | return self.timestamp + timedelta(seconds=self.duration) | |||
|
82 | return None | |||
|
83 | ||||
|
84 | def get_dict(self): | |||
|
85 | instance_dict = super(SlowCall, self).get_dict() | |||
|
86 | instance_dict['children'] = [] | |||
|
87 | instance_dict['end_time'] = self.end_time | |||
|
88 | return instance_dict | |||
|
89 | ||||
|
90 | def es_doc(self): | |||
|
91 | doc = { | |||
|
92 | 'resource_id': self.resource_id, | |||
|
93 | 'timestamp': self.timestamp, | |||
|
94 | 'pg_id': str(self.id), | |||
|
95 | 'permanent': False, | |||
|
96 | 'request_id': None, | |||
|
97 | 'log_level': 'UNKNOWN', | |||
|
98 | 'message': self.statement, | |||
|
99 | 'namespace': 'appenlight.slow_call', | |||
|
100 | 'tags': { | |||
|
101 | 'report_id': {'values': self.report_id, | |||
|
102 | 'numeric_values': self.report_id}, | |||
|
103 | 'duration': {'values': None, 'numeric_values': self.duration}, | |||
|
104 | 'statement_hash': {'values': self.statement_hash, | |||
|
105 | 'numeric_values': None}, | |||
|
106 | 'type': {'values': self.type, 'numeric_values': None}, | |||
|
107 | 'subtype': {'values': self.subtype, 'numeric_values': None}, | |||
|
108 | 'location': {'values': self.location, 'numeric_values': None}, | |||
|
109 | 'parameters': {'values': None, 'numeric_values': None} | |||
|
110 | }, | |||
|
111 | 'tag_list': ['report_id', 'duration', 'statement_hash', 'type', | |||
|
112 | 'subtype', 'location'] | |||
|
113 | } | |||
|
114 | if isinstance(self.parameters, str): | |||
|
115 | doc['tags']['parameters']['values'] = self.parameters[:255] | |||
|
116 | return doc | |||
|
117 | ||||
|
118 | @property | |||
|
119 | def partition_id(self): | |||
|
120 | return 'rcae_sc_%s' % self.report_group_time.strftime('%Y_%m') |
@@ -0,0 +1,42 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import sqlalchemy as sa | |||
|
23 | from datetime import datetime | |||
|
24 | from ziggurat_foundations.models.base import BaseModel | |||
|
25 | from sqlalchemy.dialects.postgresql import JSON | |||
|
26 | ||||
|
27 | from . import Base | |||
|
28 | ||||
|
29 | ||||
|
30 | class Tag(Base, BaseModel): | |||
|
31 | __tablename__ = 'tags' | |||
|
32 | ||||
|
33 | id = sa.Column(sa.Integer, primary_key=True) | |||
|
34 | resource_id = sa.Column(sa.Integer, | |||
|
35 | sa.ForeignKey('resources.resource_id')) | |||
|
36 | name = sa.Column(sa.Unicode(512), nullable=False) | |||
|
37 | value = sa.Column(JSON, nullable=False) | |||
|
38 | first_timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
39 | server_default=sa.func.now()) | |||
|
40 | last_timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, | |||
|
41 | server_default=sa.func.now()) | |||
|
42 | times_seen = sa.Column(sa.Integer, nullable=False, default=0) |
@@ -0,0 +1,138 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | import sqlalchemy as sa | |||
|
24 | from datetime import datetime | |||
|
25 | from appenlight.models import Base, get_db_session | |||
|
26 | from appenlight.models.services.event import EventService | |||
|
27 | from appenlight.models.integrations import IntegrationException | |||
|
28 | from pyramid.threadlocal import get_current_request | |||
|
29 | from ziggurat_foundations.models.user import UserMixin | |||
|
30 | ||||
|
31 | log = logging.getLogger(__name__) | |||
|
32 | ||||
|
33 | ||||
|
34 | class User(UserMixin, Base): | |||
|
35 | __possible_permissions__ = [] | |||
|
36 | ||||
|
37 | first_name = sa.Column(sa.Unicode(25)) | |||
|
38 | last_name = sa.Column(sa.Unicode(25)) | |||
|
39 | company_name = sa.Column(sa.Unicode(255), default='') | |||
|
40 | company_address = sa.Column(sa.Unicode(255), default='') | |||
|
41 | zip_code = sa.Column(sa.Unicode(25), default='') | |||
|
42 | city = sa.Column(sa.Unicode(50), default='') | |||
|
43 | default_report_sort = sa.Column(sa.Unicode(25), default='newest') | |||
|
44 | notes = sa.Column(sa.UnicodeText, default='') | |||
|
45 | notifications = sa.Column(sa.Boolean(), default=True) | |||
|
46 | registration_ip = sa.Column(sa.UnicodeText(), default='') | |||
|
47 | alert_channels = sa.orm.relationship('AlertChannel', | |||
|
48 | cascade="all,delete-orphan", | |||
|
49 | passive_deletes=True, | |||
|
50 | passive_updates=True, | |||
|
51 | backref='owner', | |||
|
52 | order_by='AlertChannel.channel_name, ' | |||
|
53 | 'AlertChannel.channel_value') | |||
|
54 | ||||
|
55 | alert_actions = sa.orm.relationship('AlertChannelAction', | |||
|
56 | cascade="all,delete-orphan", | |||
|
57 | passive_deletes=True, | |||
|
58 | passive_updates=True, | |||
|
59 | backref='owner', | |||
|
60 | order_by='AlertChannelAction.pkey') | |||
|
61 | ||||
|
62 | auth_tokens = sa.orm.relationship('AuthToken', | |||
|
63 | cascade="all,delete-orphan", | |||
|
64 | passive_deletes=True, | |||
|
65 | passive_updates=True, | |||
|
66 | backref='owner', | |||
|
67 | order_by='AuthToken.creation_date') | |||
|
68 | ||||
|
69 | def get_dict(self, exclude_keys=None, include_keys=None, | |||
|
70 | extended_info=False): | |||
|
71 | result = super(User, self).get_dict(exclude_keys, include_keys) | |||
|
72 | if extended_info: | |||
|
73 | result['groups'] = [g.group_name for g in self.groups] | |||
|
74 | result['permissions'] = [p.perm_name for p in self.permissions] | |||
|
75 | request = get_current_request() | |||
|
76 | apps = self.resources_with_perms( | |||
|
77 | ['view'], resource_types=['application']) | |||
|
78 | result['applications'] = sorted( | |||
|
79 | [{'resource_id': a.resource_id, | |||
|
80 | 'resource_name': a.resource_name} | |||
|
81 | for a in apps.all()], | |||
|
82 | key=lambda x: x['resource_name'].lower()) | |||
|
83 | result['assigned_reports'] = [r.get_dict(request) for r | |||
|
84 | in self.assigned_report_groups] | |||
|
85 | result['latest_events'] = [ev.get_dict(request) for ev | |||
|
86 | in self.latest_events()] | |||
|
87 | ||||
|
88 | exclude_keys_list = exclude_keys or [] | |||
|
89 | include_keys_list = include_keys or [] | |||
|
90 | d = {} | |||
|
91 | for k in result.keys(): | |||
|
92 | if (k not in exclude_keys_list and | |||
|
93 | (k in include_keys_list or not include_keys)): | |||
|
94 | d[k] = result[k] | |||
|
95 | return d | |||
|
96 | ||||
|
97 | def __repr__(self): | |||
|
98 | return '<User: %s, id: %s>' % (self.user_name, self.id) | |||
|
99 | ||||
|
100 | @property | |||
|
101 | def assigned_report_groups(self): | |||
|
102 | from appenlight.models.report_group import ReportGroup | |||
|
103 | ||||
|
104 | resources = self.resources_with_perms( | |||
|
105 | ['view'], resource_types=['application']) | |||
|
106 | query = self.assigned_reports_relation | |||
|
107 | rid_list = [r.resource_id for r in resources] | |||
|
108 | query = query.filter(ReportGroup.resource_id.in_(rid_list)) | |||
|
109 | query = query.limit(50) | |||
|
110 | return query | |||
|
111 | ||||
|
112 | def feed_report(self, report): | |||
|
113 | """ """ | |||
|
114 | if not hasattr(self, 'current_reports'): | |||
|
115 | self.current_reports = [] | |||
|
116 | self.current_reports.append(report) | |||
|
117 | ||||
|
118 | def send_digest(self, request, application, reports, since_when=None, | |||
|
119 | db_session=None): | |||
|
120 | db_session = get_db_session(db_session) | |||
|
121 | if not reports: | |||
|
122 | return True | |||
|
123 | if not since_when: | |||
|
124 | since_when = datetime.utcnow() | |||
|
125 | for channel in self.alert_channels: | |||
|
126 | if not channel.channel_validated or not channel.daily_digest: | |||
|
127 | continue | |||
|
128 | try: | |||
|
129 | channel.send_digest(resource=application, | |||
|
130 | user=self, | |||
|
131 | request=request, | |||
|
132 | since_when=since_when, | |||
|
133 | reports=reports) | |||
|
134 | except IntegrationException as e: | |||
|
135 | log.warning('%s' % e) | |||
|
136 | ||||
|
137 | def latest_events(self): | |||
|
138 | return EventService.latest_for_user(self) |
@@ -0,0 +1,27 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from ziggurat_foundations.models.user_group import UserGroupMixin | |||
|
23 | from appenlight.models import Base | |||
|
24 | ||||
|
25 | ||||
|
26 | class UserGroup(UserGroupMixin, Base): | |||
|
27 | pass |
@@ -0,0 +1,27 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from ziggurat_foundations.models.user_permission import UserPermissionMixin | |||
|
23 | from appenlight.models import Base | |||
|
24 | ||||
|
25 | ||||
|
26 | class UserPermission(UserPermissionMixin, Base): | |||
|
27 | pass |
@@ -0,0 +1,27 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | from ziggurat_foundations.models.user_resource_permission import UserResourcePermissionMixin | |||
|
23 | from appenlight.models import Base | |||
|
24 | ||||
|
25 | ||||
|
26 | class UserResourcePermission(UserResourcePermissionMixin, Base): | |||
|
27 | pass |
@@ -0,0 +1,148 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | from appenlight.forms import CSRFException | |||
|
24 | ||||
|
25 | log = logging.getLogger(__name__) | |||
|
26 | ||||
|
27 | from pyramid.interfaces import IDefaultCSRFOptions | |||
|
28 | from pyramid.session import ( | |||
|
29 | check_csrf_origin, | |||
|
30 | check_csrf_token, | |||
|
31 | ) | |||
|
32 | ||||
|
33 | # taken directly from pyramid 1.7 | |||
|
34 | # pyramid/viewderivers.py | |||
|
35 | # the difference is this deriver will ignore csrf_check when auth token | |||
|
36 | # policy is in effect | |||
|
37 | ||||
|
38 | def csrf_view(view, info): | |||
|
39 | explicit_val = info.options.get('require_csrf') | |||
|
40 | defaults = info.registry.queryUtility(IDefaultCSRFOptions) | |||
|
41 | if defaults is None: | |||
|
42 | default_val = False | |||
|
43 | token = 'csrf_token' | |||
|
44 | header = 'X-CSRF-Token' | |||
|
45 | safe_methods = frozenset(["GET", "HEAD", "OPTIONS", "TRACE"]) | |||
|
46 | else: | |||
|
47 | default_val = defaults.require_csrf | |||
|
48 | token = defaults.token | |||
|
49 | header = defaults.header | |||
|
50 | safe_methods = defaults.safe_methods | |||
|
51 | enabled = ( | |||
|
52 | explicit_val is True or | |||
|
53 | (explicit_val is not False and default_val) | |||
|
54 | ) | |||
|
55 | # disable if both header and token are disabled | |||
|
56 | enabled = enabled and (token or header) | |||
|
57 | wrapped_view = view | |||
|
58 | if enabled: | |||
|
59 | def csrf_view(context, request): | |||
|
60 | is_from_auth_token = 'auth:auth_token' in \ | |||
|
61 | request.effective_principals | |||
|
62 | if is_from_auth_token: | |||
|
63 | log.debug('ignoring CSRF check, auth token used') | |||
|
64 | elif ( | |||
|
65 | request.method not in safe_methods and | |||
|
66 | ( | |||
|
67 | # skip exception views unless value is explicitly defined | |||
|
68 | getattr(request, 'exception', None) is None or | |||
|
69 | explicit_val is not None | |||
|
70 | ) | |||
|
71 | ): | |||
|
72 | check_csrf_origin(request, raises=True) | |||
|
73 | check_csrf_token(request, token, header, raises=True) | |||
|
74 | return view(context, request) | |||
|
75 | ||||
|
76 | wrapped_view = csrf_view | |||
|
77 | return wrapped_view | |||
|
78 | ||||
|
79 | csrf_view.options = ('require_csrf',) | |||
|
80 | ||||
|
81 | ||||
|
82 | class PublicReportGroup(object): | |||
|
83 | def __init__(self, val, config): | |||
|
84 | self.val = val | |||
|
85 | ||||
|
86 | def text(self): | |||
|
87 | return 'public_report_group = %s' % (self.val,) | |||
|
88 | ||||
|
89 | phash = text | |||
|
90 | ||||
|
91 | def __call__(self, context, request): | |||
|
92 | report_group = getattr(context, 'report_group', None) | |||
|
93 | if report_group: | |||
|
94 | return context.report_group.public == self.val | |||
|
95 | ||||
|
96 | ||||
|
97 | class contextTypeClass(object): | |||
|
98 | def __init__(self, context_property, config): | |||
|
99 | self.context_property = context_property[0] | |||
|
100 | self.cls = context_property[1] | |||
|
101 | ||||
|
102 | def text(self): | |||
|
103 | return 'context_type_class = %s, %s' % ( | |||
|
104 | self.context_property, self.cls) | |||
|
105 | ||||
|
106 | phash = text | |||
|
107 | ||||
|
108 | def __call__(self, context, request): | |||
|
109 | to_check = getattr(context, self.context_property, None) | |||
|
110 | return isinstance(to_check, self.cls) | |||
|
111 | ||||
|
112 | ||||
|
113 | def unauthed_report_predicate(context, request): | |||
|
114 | """ | |||
|
115 | This allows the user to access the view if context object public | |||
|
116 | flag is True | |||
|
117 | """ | |||
|
118 | if context.public: | |||
|
119 | return True | |||
|
120 | ||||
|
121 | ||||
|
122 | def unauthed_report_predicate_inv(context, request): | |||
|
123 | """ | |||
|
124 | This allows the user to access the view if context object public | |||
|
125 | flag is NOT True | |||
|
126 | """ | |||
|
127 | if context.public: | |||
|
128 | return False | |||
|
129 | return True | |||
|
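The `contextTypeClass` predicate above matches a view only when a named attribute of the traversal context is an instance of a given class. A self-contained sketch of that decision logic, outside Pyramid (class and attribute names below are hypothetical):

```python
# Illustrative sketch of a Pyramid-style custom view predicate: it is
# constructed with a (attribute_name, expected_class) pair and, when called
# with (context, request), reports whether the context attribute matches.
class ContextTypePredicate:
    def __init__(self, context_property, config=None):
        self.context_property, self.cls = context_property

    def text(self):
        return 'context_type_class = %s, %s' % (self.context_property, self.cls)

    phash = text  # Pyramid uses phash/text to distinguish registered views

    def __call__(self, context, request):
        to_check = getattr(context, self.context_property, None)
        return isinstance(to_check, self.cls)


class DummyContext:
    report = 'not-a-dict'

pred = ContextTypePredicate(('report', str))
print(pred(DummyContext(), request=None))  # True: 'report' is a str
```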
@@ -0,0 +1,21 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 |
@@ -0,0 +1,68 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import logging | |||
|
23 | import argparse | |||
|
24 | from pyramid.paster import setup_logging | |||
|
25 | from pyramid.paster import bootstrap | |||
|
26 | from appenlight.celery.tasks import logs_cleanup | |||
|
27 | ||||
|
28 | log = logging.getLogger(__name__) | |||
|
29 | ||||
|
30 | ||||
|
31 | def main(): | |||
|
32 | choices = ['logs'] | |||
|
33 | ||||
|
34 | parser = argparse.ArgumentParser(description='Cleanup App Enlight logs') | |||
|
35 | parser.add_argument('-c', '--config', required=True, | |||
|
36 | help='Configuration ini file of application') | |||
|
37 | parser.add_argument('-t', '--types', choices=choices, | |||
|
38 | default='logs', | |||
|
39 | help='Which parts of database should get cleared') | |||
|
40 | parser.add_argument('-r', '--resource', required=True, help='Resource id') | |||
|
41 | parser.add_argument('-n', '--namespace', help='Limit to Namespace') | |||
|
42 | args = parser.parse_args() | |||
|
43 | ||||
|
44 | config_uri = args.config | |||
|
45 | setup_logging(config_uri) | |||
|
46 | log.setLevel(logging.INFO) | |||
|
47 | env = bootstrap(config_uri) | |||
|
48 | ||||
|
49 | config = { | |||
|
50 | 'types': args.types, | |||
|
51 | 'namespace': args.namespace, | |||
|
52 | 'resource': int(args.resource), | |||
|
53 | } | |||
|
54 | ||||
|
55 | action_cleanup_logs(config) | |||
|
56 | ||||
|
57 | ||||
|
58 | def action_cleanup_logs(config): | |||
|
59 | filter_settings = { | |||
|
60 | 'namespace': [] | |||
|
61 | } | |||
|
62 | if config['namespace']: | |||
|
63 | filter_settings['namespace'].append(config['namespace']) | |||
|
64 | logs_cleanup(config['resource'], filter_settings) | |||
|
65 | ||||
|
66 | ||||
|
67 | if __name__ == '__main__': | |||
|
68 | main() |
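The cleanup script builds a `filter_settings` dict from its CLI arguments before delegating to `logs_cleanup`. A sketch of that argument handling, feeding `parse_args` a literal list instead of `sys.argv` (argument values here are made up for illustration):

```python
import argparse

# Mirror of the cleanup script's argument handling and filter construction.
parser = argparse.ArgumentParser(description='Cleanup App Enlight logs')
parser.add_argument('-t', '--types', choices=['logs'], default='logs')
parser.add_argument('-r', '--resource', required=True)
parser.add_argument('-n', '--namespace')
args = parser.parse_args(['-r', '42', '-n', 'myapp'])

filter_settings = {'namespace': []}
if args.namespace:
    filter_settings['namespace'].append(args.namespace)
print(filter_settings)  # {'namespace': ['myapp']}
```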
@@ -0,0 +1,158 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import argparse | |||
|
23 | import getpass | |||
|
24 | import logging | |||
|
25 | ||||
|
26 | from pyramid.paster import setup_logging, bootstrap | |||
|
27 | from pyramid.threadlocal import get_current_request | |||
|
28 | ||||
|
29 | from appenlight.forms import UserRegisterForm | |||
|
30 | from appenlight.lib.ext_json import json | |||
|
31 | from appenlight.models import ( | |||
|
32 | DBSession, | |||
|
33 | Group, | |||
|
34 | GroupPermission, | |||
|
35 | User, | |||
|
36 | AuthToken | |||
|
37 | ) | |||
|
38 | from appenlight.models.services.group import GroupService | |||
|
39 | ||||
|
40 | log = logging.getLogger(__name__) | |||
|
41 | ||||
|
42 | _ = str | |||
|
43 | ||||
|
44 | ||||
|
45 | def is_yes(input_data): | |||
|
46 | return input_data in ['y', 'yes'] | |||
|
47 | ||||
|
48 | ||||
|
49 | def is_no(input_data): | |||
|
50 | return input_data in ['n', 'no'] | |||
|
51 | ||||
|
52 | ||||
|
53 | def main(): | |||
|
54 | parser = argparse.ArgumentParser( | |||
|
55 | description='Populate App Enlight database', | |||
|
56 | add_help=False) | |||
|
57 | parser.add_argument('-c', '--config', required=True, | |||
|
58 | help='Configuration ini file of application') | |||
|
59 | parser.add_argument('--username', default=None, | |||
|
60 | help='User to create') | |||
|
61 | parser.add_argument('--password', default=None, | |||
|
62 | help='Password for created user') | |||
|
63 | parser.add_argument('--email', default=None, | |||
|
64 | help='Email for created user') | |||
|
65 | parser.add_argument('--auth-token', default=None, | |||
|
66 | help='Auth token for created user') | |||
|
67 | args = parser.parse_args() | |||
|
68 | config_uri = args.config | |||
|
69 | ||||
|
70 | setup_logging(config_uri) | |||
|
71 | env = bootstrap(config_uri) | |||
|
72 | request = env['request'] | |||
|
73 | with get_current_request().tm: | |||
|
74 | group = GroupService.by_id(1) | |||
|
75 | if not group: | |||
|
76 | group = Group(id=1, group_name='Administrators', | |||
|
77 | description="Top level permission owners") | |||
|
78 | DBSession.add(group) | |||
|
79 | permission = GroupPermission(perm_name='root_administration') | |||
|
80 | group.permissions.append(permission) | |||
|
81 | ||||
|
82 | create_user = True if args.username else None | |||
|
83 | while create_user is None: | |||
|
84 | response = input( | |||
|
85 | 'Do you want to create a new admin? (n)\n').lower() | |||
|
86 | ||||
|
87 | if is_yes(response or 'n'): | |||
|
88 | create_user = True | |||
|
89 | elif is_no(response or 'n'): | |||
|
90 | create_user = False | |||
|
91 | ||||
|
92 | if create_user: | |||
|
93 | csrf_token = request.session.get_csrf_token() | |||
|
94 | user_name = args.username | |||
|
95 | print('*********************************************************') | |||
|
96 | while user_name is None: | |||
|
97 | response = input('What is the username of new admin?\n') | |||
|
98 | form = UserRegisterForm( | |||
|
99 | user_name=response, csrf_token=csrf_token, | |||
|
100 | csrf_context=request) | |||
|
101 | form.validate() | |||
|
102 | if form.user_name.errors: | |||
|
103 | print(form.user_name.errors[0]) | |||
|
104 | else: | |||
|
105 | user_name = response | |||
|
106 | print('The admin username is "{}"\n'.format(user_name)) | |||
|
107 | print('*********************************************************') | |||
|
108 | email = args.email | |||
|
109 | while email is None: | |||
|
110 | response = input('What is the email of admin account?\n') | |||
|
111 | form = UserRegisterForm( | |||
|
112 | email=response, csrf_token=csrf_token, | |||
|
113 | csrf_context=request) | |||
|
114 | form.validate() | |||
|
115 | if form.email.errors: | |||
|
116 | print(form.email.errors[0]) | |||
|
117 | else: | |||
|
118 | email = response | |||
|
119 | print('The admin email is "{}"\n'.format(email)) | |||
|
120 | print('*********************************************************') | |||
|
121 | user_password = args.password | |||
|
122 | confirmed_password = args.password | |||
|
123 | while user_password is None or confirmed_password is None: | |||
|
124 | response = getpass.getpass( | |||
|
125 | 'What is the password for admin account?\n') | |||
|
126 | form = UserRegisterForm( | |||
|
127 | user_password=response, csrf_token=csrf_token, | |||
|
128 | csrf_context=request) | |||
|
129 | form.validate() | |||
|
130 | if form.user_password.errors: | |||
|
131 | print(form.user_password.errors[0]) | |||
|
132 | else: | |||
|
133 | user_password = response | |||
|
134 | ||||
|
135 | response = getpass.getpass('Please confirm the password.\n') | |||
|
136 | if user_password == response: | |||
|
137 | confirmed_password = response | |||
|
138 | else: | |||
|
139 | print('Passwords do not match. Please try again') | |||
|
140 | print('*********************************************************') | |||
|
141 | ||||
|
142 | with get_current_request().tm: | |||
|
143 | if create_user: | |||
|
144 | group = GroupService.by_id(1) | |||
|
145 | user = User(user_name=user_name, email=email, status=1) | |||
|
146 | user.regenerate_security_code() | |||
|
147 | user.set_password(user_password) | |||
|
148 | DBSession.add(user) | |||
|
149 | token = AuthToken(description="Uptime monitoring token") | |||
|
150 | if args.auth_token: | |||
|
151 | token.token = args.auth_token | |||
|
152 | user.auth_tokens.append(token) | |||
|
153 | group.users.append(user) | |||
|
154 | print('USER CREATED') | |||
|
155 | print(json.dumps(user.get_dict())) | |||
|
156 | print('*********************************************************') | |||
|
157 | print('AUTH TOKEN') | |||
|
158 | print(json.dumps(user.auth_tokens[0].get_dict())) |
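The interactive prompts above treat an empty response as the default answer via `response or 'n'`, with `is_yes`/`is_no` doing the matching. A short sketch of that behavior:

```python
# Sketch of the prompt helpers: an empty response falls back to the
# default answer 'n' through the `response or 'n'` idiom.
def is_yes(input_data):
    return input_data in ['y', 'yes']

def is_no(input_data):
    return input_data in ['n', 'no']

for response, expected in [('', False), ('y', True), ('yes', True), ('n', False)]:
    assert is_yes(response or 'n') is expected
print('empty input defaults to "no", as in the script')
```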
@@ -0,0 +1,62 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import argparse | |||
|
23 | import logging | |||
|
24 | import os | |||
|
25 | import pkg_resources | |||
|
26 | import jinja2 | |||
|
27 | from cryptography.fernet import Fernet | |||
|
28 | ||||
|
29 | ||||
|
30 | log = logging.getLogger(__name__) | |||
|
31 | ||||
|
32 | ||||
|
33 | def gen_secret(): | |||
|
34 | return Fernet.generate_key().decode('utf8') | |||
|
35 | ||||
|
36 | ||||
|
37 | def main(): | |||
|
38 | parser = argparse.ArgumentParser( | |||
|
39 | description='Generate App Enlight static resources', | |||
|
40 | add_help=False) | |||
|
41 | parser.add_argument('config', help='Name of generated file') | |||
|
42 | parser.add_argument( | |||
|
43 | '--domain', | |||
|
44 | default='appenlight-rhodecode.local', | |||
|
45 | help='Domain which will be used to serve the application') | |||
|
46 | parser.add_argument( | |||
|
47 | '--dbstring', | |||
|
48 | default='postgresql://appenlight:test@127.0.0.1:5432/appenlight', | |||
|
49 | help='Database connection string for the application') | |||
|
50 | args = parser.parse_args() | |||
|
51 | ini_path = os.path.join('templates', 'ini', 'production.ini.jinja2') | |||
|
52 | template_str = pkg_resources.resource_string('appenlight', ini_path) | |||
|
53 | template = jinja2.Template(template_str.decode('utf8')) | |||
|
54 | template_vars = {'appenlight_encryption_secret': gen_secret(), | |||
|
55 | 'appenlight_authtkt_secret': gen_secret(), | |||
|
56 | 'appenlight_redis_session_secret': gen_secret(), | |||
|
57 | 'appenlight_domain': args.domain, | |||
|
58 | 'appenlight_dbstring': args.dbstring, | |||
|
59 | } | |||
|
60 | compiled = template.render(**template_vars) | |||
|
61 | with open(args.config, 'w') as f: | |||
|
62 | f.write(compiled) |
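`gen_secret()` above returns `Fernet.generate_key().decode('utf8')`. A Fernet key is 32 random bytes encoded as urlsafe base64, so a stdlib-only sketch can produce a key of the same shape without requiring the `cryptography` package (the function name here is illustrative):

```python
import base64
import os

# Stdlib-only stand-in for Fernet.generate_key(): 32 random bytes,
# urlsafe-base64 encoded, decoded to a str for use in the .ini template.
def gen_secret_like():
    return base64.urlsafe_b64encode(os.urandom(32)).decode('utf8')

secret = gen_secret_like()
print(len(secret))  # 44 characters, the same length as a real Fernet key
```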
@@ -0,0 +1,75 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # App Enlight Enterprise Edition, including its added features, Support | |||
|
19 | # services, and proprietary license terms, please see | |||
|
20 | # https://rhodecode.com/licenses/ | |||
|
21 | ||||
|
22 | import argparse | |||
|
23 | import logging | |||
|
24 | import sys | |||
|
25 | ||||
|
26 | from alembic.config import Config | |||
|
27 | from alembic import command | |||
|
28 | from pyramid.paster import setup_logging, bootstrap | |||
|
29 | from pyramid.threadlocal import get_current_registry, get_current_request | |||
|
30 | ||||
|
31 | from appenlight.lib import get_callable | |||
|
32 | from appenlight.models.services.config import ConfigService | |||
|
33 | ||||
|
34 | log = logging.getLogger(__name__) | |||
|
35 | ||||
|
36 | ||||
|
37 | def main(argv=sys.argv): | |||
|
38 | parser = argparse.ArgumentParser( | |||
|
39 | description='Migrate App Enlight database to latest version', | |||
|
40 | add_help=False) | |||
|
41 | parser.add_argument('-c', '--config', required=True, | |||
|
42 | help='Configuration ini file of application') | |||
|
43 | args = parser.parse_args() | |||
|
44 | config_uri = args.config | |||
|
45 | ||||
|
46 | setup_logging(config_uri) | |||
|
47 | bootstrap(config_uri) | |||
|
48 | registry = get_current_registry() | |||
|
49 | alembic_cfg = Config() | |||
|
50 | alembic_cfg.set_main_option("script_location", | |||
|
51 | "ziggurat_foundations:migrations") | |||
|
52 | alembic_cfg.set_main_option("sqlalchemy.url", | |||
|
53 | registry.settings["sqlalchemy.url"]) | |||
|
54 | command.upgrade(alembic_cfg, "head") | |||
|
55 | alembic_cfg = Config() | |||
|
56 | alembic_cfg.set_main_option("script_location", "appenlight:migrations") | |||
|
57 | alembic_cfg.set_main_option("sqlalchemy.url", | |||
|
58 | registry.settings["sqlalchemy.url"]) | |||
|
59 | command.upgrade(alembic_cfg, "head") | |||
|
60 | ||||
|
61 | for plugin_name, config in registry.appenlight_plugins.items(): | |||
|
62 | if config['sqlalchemy_migrations']: | |||
|
63 | alembic_cfg = Config() | |||
|
64 | alembic_cfg.set_main_option("script_location", | |||
|
65 | config['sqlalchemy_migrations']) | |||
|
66 | alembic_cfg.set_main_option("sqlalchemy.url", | |||
|
67 | registry.settings["sqlalchemy.url"]) | |||
|
68 | command.upgrade(alembic_cfg, "head") | |||
|
69 | ||||
|
70 | with get_current_request().tm: | |||
|
71 | ConfigService.setup_default_values() | |||
|
72 | ||||
|
73 | for plugin_name, config in registry.appenlight_plugins.items(): | |||
|
74 | if config['default_values_setter']: | |||
|
75 | get_callable(config['default_values_setter'])() |
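The migration script runs alembic upgrades in a fixed order: ziggurat_foundations first, then the core appenlight migrations, then any plugin that declares a `sqlalchemy_migrations` location. A sketch of that ordering logic with a stand-in plugin registry (plugin names below are hypothetical):

```python
# Collect the alembic script locations the migrate step would upgrade,
# in order, using a fake plugin registry in place of
# registry.appenlight_plugins.
plugins = {
    'ae_plugin_a': {'sqlalchemy_migrations': 'ae_plugin_a:migrations'},
    'ae_plugin_b': {'sqlalchemy_migrations': None},
}

locations = ['ziggurat_foundations:migrations', 'appenlight:migrations']
for plugin_name, config in plugins.items():
    if config['sqlalchemy_migrations']:
        locations.append(config['sqlalchemy_migrations'])
print(locations)
```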
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100755, binary diff hidden |
|
NO CONTENT: new file 100755, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644, binary diff hidden |
|
NO CONTENT: new file 100644, binary diff hidden |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100755 |
|
NO CONTENT: new file 100755 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: new file 100644 |
|