This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License, version 3
(only), as published by the Free Software Foundation.


This program incorporates work covered by the following copyright and
permission notice:

Copyright (c) 2014-2016 - packaging
file:
Copyright (c) 2008-2011 - msgpack-python
file:licenses/msgpack_license.txt
Copyright (c) 2007-2008 - amqp
file:licenses/amqp_license.txt
Copyright (c) 2013 - bcrypt
file:licenses/bcrypt_license.txt
Copyright (c) 2015 - elasticsearch
file:licenses/elasticsearch_license.txt
Copyright (c) 2011-2013 - gevent-websocket
file:licenses/gevent_websocket_license.txt
Copyright (c) 2015 - python-editor
file:licenses/python_editor_license.txt
Copyright (c) 2015 - requests
file:licenses/requests_license.txt
Copyright (c) 2014 - requests-toolbelt
file:licenses/requests_toolbelt_license.txt

All licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


Below is the full text of the GNU Affero General Public License, version 3.


GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007

Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

The precise terms and conditions for copying, distribution and
modification follow.

TERMS AND CONDITIONS

0. Definitions.

"This License" refers to version 3 of the GNU Affero General Public License.

"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based
on the Program.

To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

1. Source Code.

The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.

2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
|
554 | conditioned on the non-exercise of one or more of the rights that are | |
|
555 | specifically granted under this License. You may not convey a covered | |
|
556 | work if you are a party to an arrangement with a third party that is | |
|
557 | in the business of distributing software, under which you make payment | |
|
558 | to the third party based on the extent of your activity of conveying | |
|
559 | the work, and under which the third party grants, to any of the | |
|
560 | parties who would receive the covered work from you, a discriminatory | |
|
561 | patent license (a) in connection with copies of the covered work | |
|
562 | conveyed by you (or copies made from those copies), or (b) primarily | |
|
563 | for and in connection with specific products or compilations that | |
|
564 | contain the covered work, unless you entered into that arrangement, | |
|
565 | or that patent license was granted, prior to 28 March 2007. | |
|
566 | ||
|
567 | Nothing in this License shall be construed as excluding or limiting | |
|
568 | any implied license or other defenses to infringement that may | |
|
569 | otherwise be available to you under applicable patent law. | |
|
570 | ||
|
571 | 12. No Surrender of Others' Freedom. | |
|
572 | ||
|
573 | If conditions are imposed on you (whether by court order, agreement or | |
|
574 | otherwise) that contradict the conditions of this License, they do not | |
|
575 | excuse you from the conditions of this License. If you cannot convey a | |
|
576 | covered work so as to satisfy simultaneously your obligations under this | |
|
577 | License and any other pertinent obligations, then as a consequence you may | |
|
578 | not convey it at all. For example, if you agree to terms that obligate you | |
|
579 | to collect a royalty for further conveying from those to whom you convey | |
|
580 | the Program, the only way you could satisfy both those terms and this | |
|
581 | License would be to refrain entirely from conveying the Program. | |
|
582 | ||
|
583 | 13. Remote Network Interaction; Use with the GNU General Public License. | |
|
584 | ||
|
585 | Notwithstanding any other provision of this License, if you modify the | |
|
586 | Program, your modified version must prominently offer all users | |
|
587 | interacting with it remotely through a computer network (if your version | |
|
588 | supports such interaction) an opportunity to receive the Corresponding | |
|
589 | Source of your version by providing access to the Corresponding Source | |
|
590 | from a network server at no charge, through some standard or customary | |
|
591 | means of facilitating copying of software. This Corresponding Source | |
|
592 | shall include the Corresponding Source for any work covered by version 3 | |
|
593 | of the GNU General Public License that is incorporated pursuant to the | |
|
594 | following paragraph. | |
|
595 | ||
|
596 | Notwithstanding any other provision of this License, you have | |
|
597 | permission to link or combine any covered work with a work licensed | |
|
598 | under version 3 of the GNU General Public License into a single | |
|
599 | combined work, and to convey the resulting work. The terms of this | |
|
600 | License will continue to apply to the part which is the covered work, | |
|
601 | but the work with which it is combined will remain governed by version | |
|
602 | 3 of the GNU General Public License. | |
|
603 | ||
|
604 | 14. Revised Versions of this License. | |
|
605 | ||
|
606 | The Free Software Foundation may publish revised and/or new versions of | |
|
607 | the GNU Affero General Public License from time to time. Such new versions | |
|
608 | will be similar in spirit to the present version, but may differ in detail to | |
|
609 | address new problems or concerns. | |
|
610 | ||
|
611 | Each version is given a distinguishing version number. If the | |
|
612 | Program specifies that a certain numbered version of the GNU Affero General | |
|
613 | Public License "or any later version" applies to it, you have the | |
|
614 | option of following the terms and conditions either of that numbered | |
|
615 | version or of any later version published by the Free Software | |
|
616 | Foundation. If the Program does not specify a version number of the | |
|
617 | GNU Affero General Public License, you may choose any version ever published | |
|
618 | by the Free Software Foundation. | |
|
619 | ||
|
620 | If the Program specifies that a proxy can decide which future | |
|
621 | versions of the GNU Affero General Public License can be used, that proxy's | |
|
622 | public statement of acceptance of a version permanently authorizes you | |
|
623 | to choose that version for the Program. | |
|
624 | ||
|
625 | Later license versions may give you additional or different | |
|
626 | permissions. However, no additional obligations are imposed on any | |
|
627 | author or copyright holder as a result of your choosing to follow a | |
|
628 | later version. | |
|
629 | ||
|
630 | 15. Disclaimer of Warranty. | |
|
631 | ||
|
632 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY | |
|
633 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT | |
|
634 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY | |
|
635 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, | |
|
636 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR | |
|
637 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM | |
|
638 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF | |
|
639 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION. | |
|
640 | ||
|
641 | 16. Limitation of Liability. | |
|
642 | ||
|
643 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING | |
|
644 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS | |
|
645 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY | |
|
646 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE | |
|
647 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF | |
|
648 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD | |
|
649 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), | |
|
650 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF | |
|
651 | SUCH DAMAGES. | |
|
652 | ||
|
653 | 17. Interpretation of Sections 15 and 16. | |
|
654 | ||
|
655 | If the disclaimer of warranty and limitation of liability provided | |
|
656 | above cannot be given local legal effect according to their terms, | |
|
657 | reviewing courts shall apply local law that most closely approximates | |
|
658 | an absolute waiver of all civil liability in connection with the | |
|
659 | Program, unless a warranty or assumption of liability accompanies a | |
|
660 | copy of the Program in return for a fee. | |
|
661 | ||
|
662 | END OF TERMS AND CONDITIONS | |
|
663 | ||
|
664 | How to Apply These Terms to Your New Programs | |
|
665 | ||
|
666 | If you develop a new program, and you want it to be of the greatest | |
|
667 | possible use to the public, the best way to achieve this is to make it | |
|
668 | free software which everyone can redistribute and change under these terms. | |
|
669 | ||
|
670 | To do so, attach the following notices to the program. It is safest | |
|
671 | to attach them to the start of each source file to most effectively | |
|
672 | state the exclusion of warranty; and each file should have at least | |
|
673 | the "copyright" line and a pointer to where the full notice is found. | |
|
674 | ||
|
675 | <one line to give the program's name and a brief idea of what it does.> | |
|
676 | Copyright (C) <year> <name of author> | |
|
677 | ||
|
678 | This program is free software: you can redistribute it and/or modify | |
|
679 | it under the terms of the GNU Affero General Public License as published by | |
|
680 | the Free Software Foundation, either version 3 of the License, or | |
|
681 | (at your option) any later version. | |
|
682 | ||
|
683 | This program is distributed in the hope that it will be useful, | |
|
684 | but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
685 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
686 | GNU Affero General Public License for more details. | |
|
687 | ||
|
688 | You should have received a copy of the GNU Affero General Public License | |
|
689 | along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
690 | ||
|
691 | Also add information on how to contact you by electronic and paper mail. | |
|
692 | ||
|
693 | If your software can interact with users remotely through a computer | |
|
694 | network, you should also make sure that it provides a way for users to | |
|
695 | get its source. For example, if your program is a web application, its | |
|
696 | interface could display a "Source" link that leads users to an archive | |
|
697 | of the code. There are many ways you could offer source, and different | |
|
698 | solutions will be better for different programs; see section 13 for the | |
|
699 | specific requirements. | |
|
700 | ||
|
701 | You should also get your employer (if you work as a programmer) or school, | |
|
702 | if any, to sign a "copyright disclaimer" for the program, if necessary. | |
|
703 | For more information on this, and how to apply and follow the GNU AGPL, see | |
|
704 | <http://www.gnu.org/licenses/>. | |
|
+Below is the full text of Apache License, version 2.0
+
+
+Apache License
+Version 2.0, January 2004
+http://www.apache.org/licenses/
+
+TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+1. Definitions.
+
+"License" shall mean the terms and conditions for use, reproduction,
+and distribution as defined by Sections 1 through 9 of this document.
+
+"Licensor" shall mean the copyright owner or entity authorized by
+the copyright owner that is granting the License.
+
+"Legal Entity" shall mean the union of the acting entity and all
+other entities that control, are controlled by, or are under common
+control with that entity. For the purposes of this definition,
+"control" means (i) the power, direct or indirect, to cause the
+direction or management of such entity, whether by contract or
+otherwise, or (ii) ownership of fifty percent (50%) or more of the
+outstanding shares, or (iii) beneficial ownership of such entity.
+
+"You" (or "Your") shall mean an individual or Legal Entity
+exercising permissions granted by this License.
+
+"Source" form shall mean the preferred form for making modifications,
+including but not limited to software source code, documentation
+source, and configuration files.
+
+"Object" form shall mean any form resulting from mechanical
+transformation or translation of a Source form, including but
+not limited to compiled object code, generated documentation,
+and conversions to other media types.
+
+"Work" shall mean the work of authorship, whether in Source or
+Object form, made available under the License, as indicated by a
+copyright notice that is included in or attached to the work
+(an example is provided in the Appendix below).
+
+"Derivative Works" shall mean any work, whether in Source or Object
+form, that is based on (or derived from) the Work and for which the
+editorial revisions, annotations, elaborations, or other modifications
+represent, as a whole, an original work of authorship. For the purposes
+of this License, Derivative Works shall not include works that remain
+separable from, or merely link (or bind by name) to the interfaces of,
+the Work and Derivative Works thereof.
+
+"Contribution" shall mean any work of authorship, including
+the original version of the Work and any modifications or additions
+to that Work or Derivative Works thereof, that is intentionally
+submitted to Licensor for inclusion in the Work by the copyright owner
+or by an individual or Legal Entity authorized to submit on behalf of
+the copyright owner. For the purposes of this definition, "submitted"
+means any form of electronic, verbal, or written communication sent
+to the Licensor or its representatives, including but not limited to
+communication on electronic mailing lists, source code control systems,
+and issue tracking systems that are managed by, or on behalf of, the
+Licensor for the purpose of discussing and improving the Work, but
+excluding communication that is conspicuously marked or otherwise
+designated in writing by the copyright owner as "Not a Contribution."
+
+"Contributor" shall mean Licensor and any individual or Legal Entity
+on behalf of whom a Contribution has been received by Licensor and
+subsequently incorporated within the Work.
+
+2. Grant of Copyright License. Subject to the terms and conditions of
+this License, each Contributor hereby grants to You a perpetual,
+worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+copyright license to reproduce, prepare Derivative Works of,
+publicly display, publicly perform, sublicense, and distribute the
+Work and such Derivative Works in Source or Object form.
+
+3. Grant of Patent License. Subject to the terms and conditions of
+this License, each Contributor hereby grants to You a perpetual,
+worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+(except as stated in this section) patent license to make, have made,
+use, offer to sell, sell, import, and otherwise transfer the Work,
+where such license applies only to those patent claims licensable
+by such Contributor that are necessarily infringed by their
+Contribution(s) alone or by combination of their Contribution(s)
+with the Work to which such Contribution(s) was submitted. If You
+institute patent litigation against any entity (including a
+cross-claim or counterclaim in a lawsuit) alleging that the Work
+or a Contribution incorporated within the Work constitutes direct
+or contributory patent infringement, then any patent licenses
+granted to You under this License for that Work shall terminate
+as of the date such litigation is filed.
+
+4. Redistribution. You may reproduce and distribute copies of the
+Work or Derivative Works thereof in any medium, with or without
+modifications, and in Source or Object form, provided that You
+meet the following conditions:
+
+(a) You must give any other recipients of the Work or
+Derivative Works a copy of this License; and
+
+(b) You must cause any modified files to carry prominent notices
+stating that You changed the files; and
+
+(c) You must retain, in the Source form of any Derivative Works
+that You distribute, all copyright, patent, trademark, and
+attribution notices from the Source form of the Work,
+excluding those notices that do not pertain to any part of
+the Derivative Works; and
+
+(d) If the Work includes a "NOTICE" text file as part of its
+distribution, then any Derivative Works that You distribute must
+include a readable copy of the attribution notices contained
+within such NOTICE file, excluding those notices that do not
+pertain to any part of the Derivative Works, in at least one
+of the following places: within a NOTICE text file distributed
+as part of the Derivative Works; within the Source form or
+documentation, if provided along with the Derivative Works; or,
+within a display generated by the Derivative Works, if and
+wherever such third-party notices normally appear. The contents
+of the NOTICE file are for informational purposes only and
+do not modify the License. You may add Your own attribution
+notices within Derivative Works that You distribute, alongside
+or as an addendum to the NOTICE text from the Work, provided
+that such additional attribution notices cannot be construed
+as modifying the License.
+
+You may add Your own copyright statement to Your modifications and
+may provide additional or different license terms and conditions
+for use, reproduction, or distribution of Your modifications, or
+for any such Derivative Works as a whole, provided Your use,
+reproduction, and distribution of the Work otherwise complies with
+the conditions stated in this License.
+
+5. Submission of Contributions. Unless You explicitly state otherwise,
+any Contribution intentionally submitted for inclusion in the Work
+by You to the Licensor shall be under the terms and conditions of
+this License, without any additional terms or conditions.
+Notwithstanding the above, nothing herein shall supersede or modify
+the terms of any separate license agreement you may have executed
+with Licensor regarding such Contributions.
+
+6. Trademarks. This License does not grant permission to use the trade
+names, trademarks, service marks, or product names of the Licensor,
+except as required for reasonable and customary use in describing the
+origin of the Work and reproducing the content of the NOTICE file.
+
+7. Disclaimer of Warranty. Unless required by applicable law or
+agreed to in writing, Licensor provides the Work (and each
+Contributor provides its Contributions) on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+implied, including, without limitation, any warranties or conditions
+of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+PARTICULAR PURPOSE. You are solely responsible for determining the
+appropriateness of using or redistributing the Work and assume any
+risks associated with Your exercise of permissions under this License.
+
+8. Limitation of Liability. In no event and under no legal theory,
+whether in tort (including negligence), contract, or otherwise,
+unless required by applicable law (such as deliberate and grossly
+negligent acts) or agreed to in writing, shall any Contributor be
+liable to You for damages, including any direct, indirect, special,
+incidental, or consequential damages of any character arising as a
+result of this License or out of the use or inability to use the
+Work (including but not limited to damages for loss of goodwill,
+work stoppage, computer failure or malfunction, or any and all
+other commercial damages or losses), even if such Contributor
+has been advised of the possibility of such damages.
+
+9. Accepting Warranty or Additional Liability. While redistributing
+the Work or Derivative Works thereof, You may choose to offer,
+and charge a fee for, acceptance of support, warranty, indemnity,
+or other liability obligations and/or rights consistent with this
+License. However, in accepting such obligations, You may act only
+on Your own behalf and on Your sole responsibility, not on behalf
+of any other Contributor, and only if You agree to indemnify,
+defend, and hold each Contributor harmless for any liability
+incurred by, or claims asserted against, such Contributor by reason
+of your accepting any such warranty or additional liability.
+
+END OF TERMS AND CONDITIONS
@@ -1,222 +1,217 @@
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import datetime |
|
23 | 18 | import logging |
|
24 | 19 | import pyelasticsearch |
|
25 | 20 | import redis |
|
26 | 21 | import os |
|
27 | 22 | from pkg_resources import iter_entry_points |
|
28 | 23 | |
|
29 | 24 | import appenlight.lib.jinja2_filters as jinja2_filters |
|
30 | 25 | import appenlight.lib.encryption as encryption |
|
31 | 26 | |
|
32 | 27 | from pyramid.config import PHASE3_CONFIG |
|
33 | 28 | from pyramid.authentication import AuthTktAuthenticationPolicy |
|
34 | 29 | from pyramid.authorization import ACLAuthorizationPolicy |
|
35 | 30 | from pyramid_mailer.mailer import Mailer |
|
36 | 31 | from pyramid.renderers import JSON |
|
37 | 32 | from pyramid_redis_sessions import session_factory_from_settings |
|
38 | 33 | from pyramid.settings import asbool, aslist |
|
39 | 34 | from pyramid.security import AllPermissionsList |
|
40 | 35 | from pyramid_authstack import AuthenticationStackPolicy |
|
41 | 36 | from redlock import Redlock |
|
42 | 37 | from sqlalchemy import engine_from_config |
|
43 | 38 | |
|
44 | 39 | from appenlight.celery import configure_celery |
|
45 | 40 | from appenlight.lib.configurator import (CythonCompatConfigurator, |
|
46 | 41 | register_appenlight_plugin) |
|
47 | 42 | from appenlight.lib import cache_regions |
|
48 | 43 | from appenlight.lib.ext_json import json |
|
49 | 44 | from appenlight.security import groupfinder, AuthTokenAuthenticationPolicy |
|
50 | 45 | |
|
51 | 46 | __license__ = 'AGPLv3, and Commercial License' |
|
52 | 47 | __author__ = 'RhodeCode GmbH' |
|
53 | 48 | __url__ = 'http://rhodecode.com' |
|
54 | 49 | |
|
55 | 50 | json_renderer = JSON(serializer=json.dumps, indent=4) |
|
56 | 51 | |
|
57 | 52 | log = logging.getLogger(__name__) |
|
58 | 53 | |
|
59 | 54 | |
|
60 | 55 | def datetime_adapter(obj, request): |
|
61 | 56 | return obj.isoformat() |
|
62 | 57 | |
|
63 | 58 | |
|
64 | 59 | def all_permissions_adapter(obj, request): |
|
65 | 60 | return '__all_permissions__' |
|
66 | 61 | |
|
67 | 62 | |
|
68 | 63 | json_renderer.add_adapter(datetime.datetime, datetime_adapter) |
|
69 | 64 | json_renderer.add_adapter(AllPermissionsList, all_permissions_adapter) |
|
70 | 65 | |
|
71 | 66 | |
|
72 | 67 | def main(global_config, **settings): |
|
73 | 68 | """ This function returns a Pyramid WSGI application. |
|
74 | 69 | """ |
|
75 | 70 | auth_tkt_policy = AuthTktAuthenticationPolicy( |
|
76 | 71 | settings['authtkt.secret'], |
|
77 | 72 | hashalg='sha512', |
|
78 | 73 | callback=groupfinder, |
|
79 | 74 | max_age=2592000, |
|
80 | 75 | secure=asbool(settings.get('authtkt.secure', 'false'))) |
|
81 | 76 | auth_token_policy = AuthTokenAuthenticationPolicy( |
|
82 | 77 | callback=groupfinder |
|
83 | 78 | ) |
|
84 | 79 | authorization_policy = ACLAuthorizationPolicy() |
|
85 | 80 | authentication_policy = AuthenticationStackPolicy() |
|
86 | 81 | authentication_policy.add_policy('auth_tkt', auth_tkt_policy) |
|
87 | 82 | authentication_policy.add_policy('auth_token', auth_token_policy) |
|
88 | 83 | # set crypto key |
|
89 | 84 | encryption.ENCRYPTION_SECRET = settings.get('encryption_secret') |
|
90 | 85 | # import this later so encyption key can be monkeypatched |
|
91 | 86 | from appenlight.models import DBSession, register_datastores |
|
92 | 87 | # update config with cometd info |
|
93 | 88 | settings['cometd_servers'] = {'server': settings['cometd.server'], |
|
94 | 89 | 'secret': settings['cometd.secret']} |
|
95 | 90 | |
|
96 | 91 | # Create the Pyramid Configurator. |
|
97 | 92 | settings['_mail_url'] = settings['mailing.app_url'] |
|
98 | 93 | config = CythonCompatConfigurator( |
|
99 | 94 | settings=settings, |
|
100 | 95 | authentication_policy=authentication_policy, |
|
101 | 96 | authorization_policy=authorization_policy, |
|
102 | 97 | root_factory='appenlight.security.RootFactory', |
|
103 | 98 | default_permission='view') |
|
104 | 99 | # custom registry variables |
|
105 | 100 | |
|
106 | 101 | # resource type information |
|
107 | 102 | config.registry.resource_types = ['resource', 'application'] |
|
108 | 103 | # plugin information |
|
109 | 104 | config.registry.appenlight_plugins = {} |
|
110 | 105 | |
|
111 | 106 | config.set_default_csrf_options(require_csrf=True, header='X-XSRF-TOKEN') |
|
112 | 107 | config.add_view_deriver('appenlight.predicates.csrf_view', |
|
113 | 108 | name='csrf_view') |
|
114 | 109 | |
|
115 | 110 | # later, when config is available |
|
116 | 111 | dogpile_config = {'url': settings['redis.url'], |
|
117 | 112 | "redis_expiration_time": 86400, |
|
118 | 113 | "redis_distributed_lock": True} |
|
119 | 114 | cache_regions.regions = cache_regions.CacheRegions(dogpile_config) |
|
120 | 115 | config.registry.cache_regions = cache_regions.regions |
|
121 | 116 | engine = engine_from_config(settings, 'sqlalchemy.', |
|
122 | 117 | json_serializer=json.dumps) |
|
123 | 118 | DBSession.configure(bind=engine) |
|
124 | 119 | |
|
125 | 120 | # json rederer that serializes datetime |
|
126 | 121 | config.add_renderer('json', json_renderer) |
|
127 | 122 | config.set_request_property('appenlight.lib.request.es_conn', 'es_conn') |
|
128 | 123 | config.set_request_property('appenlight.lib.request.get_user', 'user', |
|
129 | 124 | reify=True) |
|
130 | 125 | config.set_request_property('appenlight.lib.request.get_csrf_token', |
|
131 | 126 | 'csrf_token', reify=True) |
|
132 | 127 | config.set_request_property('appenlight.lib.request.safe_json_body', |
|
133 | 128 | 'safe_json_body', reify=True) |
|
134 | 129 | config.set_request_property('appenlight.lib.request.unsafe_json_body', |
|
135 | 130 | 'unsafe_json_body', reify=True) |
|
136 | 131 | config.add_request_method('appenlight.lib.request.add_flash_to_headers', |
|
137 | 132 | 'add_flash_to_headers') |
|
138 | 133 | config.add_request_method('appenlight.lib.request.get_authomatic', |
|
139 | 134 | 'authomatic', reify=True) |
|
140 | 135 | |
|
141 | 136 | config.include('pyramid_redis_sessions') |
|
142 | 137 | config.include('pyramid_tm') |
|
143 | 138 | config.include('pyramid_jinja2') |
|
144 | 139 | config.include('appenlight_client.ext.pyramid_tween') |
|
145 | 140 | config.include('ziggurat_foundations.ext.pyramid.sign_in') |
|
146 | 141 | es_server_list = aslist(settings['elasticsearch.nodes']) |
|
147 | 142 | redis_url = settings['redis.url'] |
|
148 | 143 | log.warning('Elasticsearch server list: {}'.format(es_server_list)) |
|
149 | 144 | log.warning('Redis server: {}'.format(redis_url)) |
|
150 | 145 | config.registry.es_conn = pyelasticsearch.ElasticSearch(es_server_list) |
|
151 | 146 | config.registry.redis_conn = redis.StrictRedis.from_url(redis_url) |
|
152 | 147 | |
|
153 | 148 | config.registry.redis_lockmgr = Redlock([settings['redis.redlock.url'], ], |
|
154 | 149 | retry_count=0, retry_delay=0) |
|
155 | 150 | # mailer |
|
156 | 151 | config.registry.mailer = Mailer.from_settings(settings) |
|
157 | 152 | |
|
158 | 153 | # Configure sessions |
|
159 | 154 | session_factory = session_factory_from_settings(settings) |
|
160 | 155 | config.set_session_factory(session_factory) |
|
161 | 156 | |
|
162 | 157 | # Configure renderers and event subscribers |
|
163 | 158 | config.add_jinja2_extension('jinja2.ext.loopcontrols') |
|
164 | 159 | config.add_jinja2_search_path('appenlight:templates') |
|
165 | 160 | # event subscribers |
|
166 | 161 | config.add_subscriber("appenlight.subscribers.application_created", |
|
167 | 162 | "pyramid.events.ApplicationCreated") |
|
168 | 163 | config.add_subscriber("appenlight.subscribers.add_renderer_globals", |
|
169 | 164 | "pyramid.events.BeforeRender") |
|
170 | 165 | config.add_subscriber('appenlight.subscribers.new_request', |
|
171 | 166 | 'pyramid.events.NewRequest') |
|
172 | 167 | config.add_view_predicate('context_type_class', |
|
173 | 168 | 'appenlight.predicates.contextTypeClass') |
|
174 | 169 | |
|
175 | 170 | register_datastores(es_conn=config.registry.es_conn, |
|
176 | 171 | redis_conn=config.registry.redis_conn, |
|
177 | 172 | redis_lockmgr=config.registry.redis_lockmgr) |
|
178 | 173 | |
|
179 | 174 | # base stuff and scan |
|
180 | 175 | |
|
181 | 176 | # need to ensure the webassets dir exists, otherwise config.override_asset()

182 | 177 | # raises an exception
|
183 | 178 | if not os.path.exists(settings['webassets.dir']): |
|
184 | 179 | os.mkdir(settings['webassets.dir']) |
|
185 | 180 | config.add_static_view(path='appenlight:webassets', |
|
186 | 181 | name='static', cache_max_age=3600) |
|
187 | 182 | config.override_asset(to_override='appenlight:webassets/', |
|
188 | 183 | override_with=settings['webassets.dir']) |
|
189 | 184 | |
|
190 | 185 | config.include('appenlight.views') |
|
191 | 186 | config.include('appenlight.views.admin') |
|
192 | 187 | config.scan(ignore=['appenlight.migrations', 'appenlight.scripts', |
|
193 | 188 | 'appenlight.tests']) |
|
194 | 189 | |
|
195 | 190 | config.add_directive('register_appenlight_plugin', |
|
196 | 191 | register_appenlight_plugin) |
|
197 | 192 | |
|
198 | 193 | for entry_point in iter_entry_points(group='appenlight.plugins'): |
|
199 | 194 | plugin = entry_point.load() |
|
200 | 195 | plugin.includeme(config) |
|
201 | 196 | |
|
202 | 197 | # include other appenlight plugins explicitly if needed
|
203 | 198 | includes = aslist(settings.get('appenlight.includes', [])) |
|
204 | 199 | for inc in includes: |
|
205 | 200 | config.include(inc) |
|
206 | 201 | |
|
207 | 202 | # run this after everything is registered in the configurator
|
208 | 203 | |
|
209 | 204 | def pre_commit(): |
|
210 | 205 | jinja_env = config.get_jinja2_environment() |
|
211 | 206 | jinja_env.filters['tojson'] = json.dumps |
|
212 | 207 | jinja_env.filters['toJSONUnsafe'] = jinja2_filters.toJSONUnsafe |
|
213 | 208 | |
|
214 | 209 | config.action(None, pre_commit, order=PHASE3_CONFIG + 999) |
|
215 | 210 | |
|
216 | 211 | def wrap_config_celery(): |
|
217 | 212 | configure_celery(config.registry) |
|
218 | 213 | |
|
219 | 214 | config.action(None, wrap_config_celery, order=PHASE3_CONFIG + 999) |
|
220 | 215 | |
|
221 | 216 | app = config.make_wsgi_app() |
|
222 | 217 | return app |
@@ -1,176 +1,171 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | |
|
24 | 19 | from datetime import timedelta |
|
25 | 20 | from celery import Celery |
|
26 | 21 | from celery.bin import Option |
|
27 | 22 | from celery.schedules import crontab |
|
28 | 23 | from celery.signals import worker_init, task_revoked, user_preload_options |
|
29 | 24 | from celery.signals import task_prerun, task_retry, task_failure, task_success |
|
30 | 25 | from kombu.serialization import register |
|
31 | 26 | from pyramid.paster import bootstrap |
|
32 | 27 | from pyramid.request import Request |
|
33 | 28 | from pyramid.scripting import prepare |
|
34 | 29 | from pyramid.settings import asbool |
|
35 | 30 | from pyramid.threadlocal import get_current_request |
|
36 | 31 | |
|
37 | 32 | from appenlight.celery.encoders import json_dumps, json_loads |
|
38 | 33 | from appenlight_client.ext.celery import register_signals |
|
39 | 34 | |
|
40 | 35 | log = logging.getLogger(__name__) |
|
41 | 36 | |
|
42 | 37 | register('date_json', json_dumps, json_loads, |
|
43 | 38 | content_type='application/x-date_json', |
|
44 | 39 | content_encoding='utf-8') |
|
45 | 40 | |
|
46 | 41 | celery = Celery() |
|
47 | 42 | |
|
48 | 43 | celery.user_options['preload'].add( |
|
49 | 44 | Option('--ini', dest='ini', default=None, |
|
50 | 45 | help='Specifies pyramid configuration file location.') |
|
51 | 46 | ) |
|
52 | 47 | |
|
53 | 48 | |
|
54 | 49 | @user_preload_options.connect |
|
55 | 50 | def on_preload_parsed(options, **kwargs): |
|
56 | 51 | """ |
|
57 | 52 | Configures Celery from the Pyramid config file.
|
58 | 53 | """ |
|
59 | 54 | celery.conf['INI_PYRAMID'] = options['ini'] |
|
60 | 55 | import appenlight_client.client as e_client |
|
61 | 56 | ini_location = options['ini'] |
|
62 | 57 | if not ini_location: |
|
63 | 58 | raise Exception('You need to pass pyramid ini location using ' |
|
64 | 59 | '--ini=filename.ini argument to the worker') |
|
65 | 60 | env = bootstrap(ini_location) |
|
66 | 61 | api_key = env['request'].registry.settings['appenlight.api_key'] |
|
67 | 62 | tr_config = env['request'].registry.settings.get( |
|
68 | 63 | 'appenlight.transport_config') |
|
69 | 64 | CONFIG = e_client.get_config({'appenlight.api_key': api_key}) |
|
70 | 65 | if tr_config: |
|
71 | 66 | CONFIG['appenlight.transport_config'] = tr_config |
|
72 | 67 | APPENLIGHT_CLIENT = e_client.Client(CONFIG) |
|
73 | 68 | # log.addHandler(APPENLIGHT_CLIENT.log_handler) |
|
74 | 69 | register_signals(APPENLIGHT_CLIENT) |
|
75 | 70 | celery.pyramid = env |
|
76 | 71 | |
|
77 | 72 | |
|
78 | 73 | celery_config = { |
|
79 | 74 | 'CELERY_IMPORTS': ["appenlight.celery.tasks", ], |
|
80 | 75 | 'CELERYD_TASK_TIME_LIMIT': 60, |
|
81 | 76 | 'CELERYD_MAX_TASKS_PER_CHILD': 1000, |
|
82 | 77 | 'CELERY_IGNORE_RESULT': True, |
|
83 | 78 | 'CELERY_ACCEPT_CONTENT': ['date_json'], |
|
84 | 79 | 'CELERY_TASK_SERIALIZER': 'date_json', |
|
85 | 80 | 'CELERY_RESULT_SERIALIZER': 'date_json', |
|
86 | 81 | 'BROKER_URL': None, |
|
87 | 82 | 'CELERYD_CONCURRENCY': None, |
|
88 | 83 | 'CELERY_TIMEZONE': None, |
|
89 | 84 | 'CELERYBEAT_SCHEDULE': { |
|
90 | 85 | 'alerting_reports': { |
|
91 | 86 | 'task': 'appenlight.celery.tasks.alerting_reports', |
|
92 | 87 | 'schedule': timedelta(seconds=60) |
|
93 | 88 | }, |
|
94 | 89 | 'close_alerts': { |
|
95 | 90 | 'task': 'appenlight.celery.tasks.close_alerts', |
|
96 | 91 | 'schedule': timedelta(seconds=60) |
|
97 | 92 | } |
|
98 | 93 | } |
|
99 | 94 | } |
|
100 | 95 | celery.config_from_object(celery_config) |
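The CELERYBEAT_SCHEDULE above is a plain dict mapping entry names to task/schedule pairs; configure_celery() later extends it with entries whose intervals come from settings. A minimal stdlib sketch of that shape — the helper function here is illustrative, not part of the codebase:

```python
from datetime import timedelta

# Base schedule, mirroring the shape of CELERYBEAT_SCHEDULE above.
beat_schedule = {
    'alerting_reports': {
        'task': 'appenlight.celery.tasks.alerting_reports',
        'schedule': timedelta(seconds=60),
    },
}

def add_interval_task(schedule, name, task, seconds):
    # illustrative helper: configure_celery() does this inline for the
    # 'notifications' entry, reading the interval from settings
    schedule[name] = {'task': task,
                      'schedule': timedelta(seconds=seconds)}

add_interval_task(beat_schedule, 'notifications',
                  'appenlight.celery.tasks.notifications_reports', 60)
```

Because the schedule is mutated before the final config_from_object() call, plugins can contribute their own `celery_beats` entries the same way.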
|
101 | 96 | |
|
102 | 97 | |
|
103 | 98 | def configure_celery(pyramid_registry): |
|
104 | 99 | settings = pyramid_registry.settings |
|
105 | 100 | celery_config['BROKER_URL'] = settings['celery.broker_url'] |
|
106 | 101 | celery_config['CELERYD_CONCURRENCY'] = settings['celery.concurrency'] |
|
107 | 102 | celery_config['CELERY_TIMEZONE'] = settings['celery.timezone'] |
|
108 | 103 | |
|
109 | 104 | notifications_seconds = int(settings.get('tasks.notifications_reports.interval', 60)) |
|
110 | 105 | |
|
111 | 106 | celery_config['CELERYBEAT_SCHEDULE']['notifications'] = { |
|
112 | 107 | 'task': 'appenlight.celery.tasks.notifications_reports', |
|
113 | 108 | 'schedule': timedelta(seconds=notifications_seconds) |
|
114 | 109 | } |
|
115 | 110 | |
|
116 | 111 | celery_config['CELERYBEAT_SCHEDULE']['daily_digest'] = { |
|
117 | 112 | 'task': 'appenlight.celery.tasks.daily_digest', |
|
118 | 113 | 'schedule': crontab(minute=1, hour='4,12,20') |
|
119 | 114 | } |
|
120 | 115 | |
|
121 | 116 | if asbool(settings.get('celery.always_eager')): |
|
122 | 117 | celery_config['CELERY_ALWAYS_EAGER'] = True |
|
123 | 118 | celery_config['CELERY_EAGER_PROPAGATES_EXCEPTIONS'] = True |
|
124 | 119 | |
|
125 | 120 | for plugin in pyramid_registry.appenlight_plugins.values(): |
|
126 | 121 | if plugin.get('celery_tasks'): |
|
127 | 122 | celery_config['CELERY_IMPORTS'].extend(plugin['celery_tasks']) |
|
128 | 123 | if plugin.get('celery_beats'): |
|
129 | 124 | for name, config in plugin['celery_beats']: |
|
130 | 125 | celery_config['CELERYBEAT_SCHEDULE'][name] = config |
|
131 | 126 | celery.config_from_object(celery_config) |
|
132 | 127 | |
|
133 | 128 | |
|
134 | 129 | @task_prerun.connect |
|
135 | 130 | def task_prerun_signal(task_id, task, args, kwargs, **kwaargs): |
|
136 | 131 | if hasattr(celery, 'pyramid'): |
|
137 | 132 | env = celery.pyramid |
|
138 | 133 | env = prepare(registry=env['request'].registry) |
|
139 | 134 | proper_base_url = env['request'].registry.settings['mailing.app_url'] |
|
140 | 135 | tmp_req = Request.blank('/', base_url=proper_base_url) |
|
141 | 136 | # ensure tasks generate URLs for the right domain from config
|
142 | 137 | env['request'].environ['HTTP_HOST'] = tmp_req.environ['HTTP_HOST'] |
|
143 | 138 | env['request'].environ['SERVER_PORT'] = tmp_req.environ['SERVER_PORT'] |
|
144 | 139 | env['request'].environ['SERVER_NAME'] = tmp_req.environ['SERVER_NAME'] |
|
145 | 140 | env['request'].environ['wsgi.url_scheme'] = \ |
|
146 | 141 | tmp_req.environ['wsgi.url_scheme'] |
|
147 | 142 | get_current_request().tm.begin() |
|
148 | 143 | |
|
149 | 144 | |
|
150 | 145 | @task_success.connect |
|
151 | 146 | def task_success_signal(result, **kwargs): |
|
152 | 147 | get_current_request().tm.commit() |
|
153 | 148 | if hasattr(celery, 'pyramid'): |
|
154 | 149 | celery.pyramid["closer"]() |
|
155 | 150 | |
|
156 | 151 | |
|
157 | 152 | @task_retry.connect |
|
158 | 153 | def task_retry_signal(request, reason, einfo, **kwargs): |
|
159 | 154 | get_current_request().tm.abort() |
|
160 | 155 | if hasattr(celery, 'pyramid'): |
|
161 | 156 | celery.pyramid["closer"]() |
|
162 | 157 | |
|
163 | 158 | |
|
164 | 159 | @task_failure.connect |
|
165 | 160 | def task_failure_signal(task_id, exception, args, kwargs, traceback, einfo, |
|
166 | 161 | **kwaargs): |
|
167 | 162 | get_current_request().tm.abort() |
|
168 | 163 | if hasattr(celery, 'pyramid'): |
|
169 | 164 | celery.pyramid["closer"]() |
|
170 | 165 | |
|
171 | 166 | |
|
172 | 167 | @task_revoked.connect |
|
173 | 168 | def task_revoked_signal(request, terminated, signum, expired, **kwaargs): |
|
174 | 169 | get_current_request().tm.abort() |
|
175 | 170 | if hasattr(celery, 'pyramid'): |
|
176 | 171 | celery.pyramid["closer"]() |
@@ -1,65 +1,60 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import json |
|
23 | 18 | from datetime import datetime, date, timedelta |
|
24 | 19 | |
|
25 | 20 | DATE_FORMAT = '%Y-%m-%dT%H:%M:%S.%f' |
|
26 | 21 | |
|
27 | 22 | |
|
28 | 23 | class DateEncoder(json.JSONEncoder): |
|
29 | 24 | def default(self, obj): |
|
30 | 25 | if isinstance(obj, datetime): |
|
31 | 26 | return { |
|
32 | 27 | '__type__': '__datetime__', |
|
33 | 28 | 'iso': obj.strftime(DATE_FORMAT) |
|
34 | 29 | } |
|
35 | 30 | elif isinstance(obj, date): |
|
36 | 31 | return { |
|
37 | 32 | '__type__': '__date__', |
|
38 | 33 | 'iso': obj.strftime(DATE_FORMAT) |
|
39 | 34 | } |
|
40 | 35 | elif isinstance(obj, timedelta): |
|
41 | 36 | return { |
|
42 | 37 | '__type__': '__timedelta__', |
|
43 | 38 | 'seconds': obj.total_seconds() |
|
44 | 39 | } |
|
45 | 40 | else: |
|
46 | 41 | return json.JSONEncoder.default(self, obj) |
|
47 | 42 | |
|
48 | 43 | |
|
49 | 44 | def date_decoder(dct): |
|
50 | 45 | if '__type__' in dct: |
|
51 | 46 | if dct['__type__'] == '__datetime__': |
|
52 | 47 | return datetime.strptime(dct['iso'], DATE_FORMAT) |
|
53 | 48 | elif dct['__type__'] == '__date__': |
|
54 | 49 | return datetime.strptime(dct['iso'], DATE_FORMAT).date() |
|
55 | 50 | elif dct['__type__'] == '__timedelta__': |
|
56 | 51 | return timedelta(seconds=dct['seconds']) |
|
57 | 52 | return dct |
|
58 | 53 | |
|
59 | 54 | |
|
60 | 55 | def json_dumps(obj): |
|
61 | 56 | return json.dumps(obj, cls=DateEncoder) |
|
62 | 57 | |
|
63 | 58 | |
|
64 | 59 | def json_loads(obj): |
|
65 | 60 | return json.loads(obj.decode('utf8'), object_hook=date_decoder) |
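The encoder/decoder pair above tags datetime, date and timedelta values with a `__type__` marker so they survive a JSON round trip (this is what the `date_json` kombu serializer is built from). A self-contained sketch mirroring that logic, with a usage example:

```python
import json
from datetime import datetime, date, timedelta

DATE_FORMAT = '%Y-%m-%dT%H:%M:%S.%f'

class DateEncoder(json.JSONEncoder):
    def default(self, obj):
        # datetime must be checked before date: datetime subclasses date
        if isinstance(obj, datetime):
            return {'__type__': '__datetime__', 'iso': obj.strftime(DATE_FORMAT)}
        elif isinstance(obj, date):
            return {'__type__': '__date__', 'iso': obj.strftime(DATE_FORMAT)}
        elif isinstance(obj, timedelta):
            return {'__type__': '__timedelta__', 'seconds': obj.total_seconds()}
        return json.JSONEncoder.default(self, obj)

def date_decoder(dct):
    # object_hook runs on every decoded dict; pass through untagged ones
    if dct.get('__type__') == '__datetime__':
        return datetime.strptime(dct['iso'], DATE_FORMAT)
    if dct.get('__type__') == '__date__':
        return datetime.strptime(dct['iso'], DATE_FORMAT).date()
    if dct.get('__type__') == '__timedelta__':
        return timedelta(seconds=dct['seconds'])
    return dct

payload = {'when': datetime(2016, 5, 1, 12, 30, 0, 1000),
           'day': date(2016, 5, 1),
           'delta': timedelta(seconds=90)}
raw = json.dumps(payload, cls=DateEncoder)
restored = json.loads(raw, object_hook=date_decoder)
```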
@@ -1,665 +1,660 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import bisect |
|
23 | 18 | import collections |
|
24 | 19 | import math |
|
25 | 20 | from datetime import datetime, timedelta |
|
26 | 21 | |
|
27 | 22 | import sqlalchemy as sa |
|
28 | 23 | import pyelasticsearch |
|
29 | 24 | |
|
30 | 25 | from celery.utils.log import get_task_logger |
|
31 | 26 | from zope.sqlalchemy import mark_changed |
|
32 | 27 | from pyramid.threadlocal import get_current_request, get_current_registry |
|
33 | 28 | from appenlight.celery import celery |
|
34 | 29 | from appenlight.models.report_group import ReportGroup |
|
35 | 30 | from appenlight.models import DBSession, Datastores |
|
36 | 31 | from appenlight.models.report import Report |
|
37 | 32 | from appenlight.models.log import Log |
|
38 | 33 | from appenlight.models.metric import Metric |
|
39 | 34 | from appenlight.models.event import Event |
|
40 | 35 | |
|
41 | 36 | from appenlight.models.services.application import ApplicationService |
|
42 | 37 | from appenlight.models.services.event import EventService |
|
43 | 38 | from appenlight.models.services.log import LogService |
|
44 | 39 | from appenlight.models.services.report import ReportService |
|
45 | 40 | from appenlight.models.services.report_group import ReportGroupService |
|
46 | 41 | from appenlight.models.services.user import UserService |
|
47 | 42 | from appenlight.models.tag import Tag |
|
48 | 43 | from appenlight.lib import print_traceback |
|
49 | 44 | from appenlight.lib.utils import parse_proto, in_batches |
|
50 | 45 | from appenlight.lib.ext_json import json |
|
51 | 46 | from appenlight.lib.redis_keys import REDIS_KEYS |
|
52 | 47 | from appenlight.lib.enums import ReportType |
|
53 | 48 | |
|
54 | 49 | log = get_task_logger(__name__) |
|
55 | 50 | |
|
56 | 51 | sample_boundries = list(range(100, 1000, 100)) + \ |
|
57 | 52 | list(range(1000, 10000, 1000)) + \ |
|
58 | 53 | list(range(10000, 100000, 5000)) |
|
59 | 54 | |
|
60 | 55 | |
|
61 | 56 | def pick_sample(total_occurences, report_type=None): |
|
62 | 57 | every = 1.0 |
|
63 | 58 | position = bisect.bisect_left(sample_boundries, total_occurences) |
|
64 | 59 | if position > 0: |
|
65 | 60 | if report_type == ReportType.not_found: |
|
66 | 61 | divide = 10.0 |
|
67 | 62 | else: |
|
68 | 63 | divide = 100.0 |
|
69 | 64 | every = sample_boundries[position - 1] / divide |
|
70 | 65 | return total_occurences % every == 0 |
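pick_sample() above stores every occurrence until the first boundary (100) is reached, then keeps only every Nth report, with N growing as the group gets noisier (and a gentler divisor for not_found reports). A standalone sketch of the same bisect logic, with the divisor passed explicitly for illustration:

```python
import bisect

sample_boundries = list(range(100, 1000, 100)) + \
                   list(range(1000, 10000, 1000)) + \
                   list(range(10000, 100000, 5000))

def pick_sample(total_occurences, divide=100.0):
    # same logic as the task module above; divide=10.0 would correspond
    # to the not_found report type
    every = 1.0
    position = bisect.bisect_left(sample_boundries, total_occurences)
    if position > 0:
        every = sample_boundries[position - 1] / divide
    return total_occurences % every == 0

# below the first boundary every occurrence is kept
kept_low = [n for n in range(1, 100) if pick_sample(n)]
# past the 1000 boundary only roughly every 10th occurrence is stored
kept_mid = [n for n in range(1000, 1100) if pick_sample(n)]
```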
|
71 | 66 | |
|
72 | 67 | |
|
73 | 68 | @celery.task(queue="default", default_retry_delay=1, max_retries=2) |
|
74 | 69 | def test_exception_task(): |
|
75 | 70 | log.error('test celery log', extra={'location': 'celery'}) |
|
76 | 71 | log.warning('test celery log', extra={'location': 'celery'}) |
|
77 | 72 | raise Exception('Celery exception test') |
|
78 | 73 | |
|
79 | 74 | |
|
80 | 75 | @celery.task(queue="default", default_retry_delay=1, max_retries=2) |
|
81 | 76 | def test_retry_exception_task(): |
|
82 | 77 | try: |
|
83 | 78 | import time |
|
84 | 79 | |
|
85 | 80 | time.sleep(1.3) |
|
86 | 81 | log.error('test retry celery log', extra={'location': 'celery'}) |
|
87 | 82 | log.warning('test retry celery log', extra={'location': 'celery'}) |
|
88 | 83 | raise Exception('Celery exception test') |
|
89 | 84 | except Exception as exc: |
|
90 | 85 | test_retry_exception_task.retry(exc=exc) |
|
91 | 86 | |
|
92 | 87 | |
|
93 | 88 | @celery.task(queue="reports", default_retry_delay=600, max_retries=144) |
|
94 | 89 | def add_reports(resource_id, request_params, dataset, **kwargs): |
|
95 | 90 | proto_version = parse_proto(request_params.get('protocol_version', '')) |
|
96 | 91 | current_time = datetime.utcnow().replace(second=0, microsecond=0) |
|
97 | 92 | try: |
|
98 | 93 | # we will store ES docs here for a single bulk insert
|
99 | 94 | es_report_docs = {} |
|
100 | 95 | es_report_group_docs = {} |
|
101 | 96 | resource = ApplicationService.by_id(resource_id) |
|
102 | 97 | |
|
103 | 98 | tags = [] |
|
104 | 99 | es_slow_calls_docs = {} |
|
105 | 100 | es_reports_stats_rows = {} |
|
106 | 101 | for report_data in dataset: |
|
107 | 102 | # build report details for later |
|
108 | 103 | added_details = 0 |
|
109 | 104 | report = Report() |
|
110 | 105 | report.set_data(report_data, resource, proto_version) |
|
111 | 106 | report._skip_ft_index = True |
|
112 | 107 | |
|
113 | 108 | # find latest group in this month's partition
|
114 | 109 | report_group = ReportGroupService.by_hash_and_resource( |
|
115 | 110 | report.resource_id, |
|
116 | 111 | report.grouping_hash, |
|
117 | 112 | since_when=datetime.utcnow().date().replace(day=1) |
|
118 | 113 | ) |
|
119 | 114 | occurences = report_data.get('occurences', 1) |
|
120 | 115 | if not report_group: |
|
121 | 116 | # total_reports will be incremented a moment later
|
122 | 117 | report_group = ReportGroup(grouping_hash=report.grouping_hash, |
|
123 | 118 | occurences=0, total_reports=0, |
|
124 | 119 | last_report=0, |
|
125 | 120 | priority=report.priority, |
|
126 | 121 | error=report.error, |
|
127 | 122 | first_timestamp=report.start_time) |
|
128 | 123 | report_group._skip_ft_index = True |
|
129 | 124 | report_group.report_type = report.report_type |
|
130 | 125 | report.report_group_time = report_group.first_timestamp |
|
131 | 126 | add_sample = pick_sample(report_group.occurences, |
|
132 | 127 | report_type=report_group.report_type) |
|
133 | 128 | if add_sample: |
|
134 | 129 | resource.report_groups.append(report_group) |
|
135 | 130 | report_group.reports.append(report) |
|
136 | 131 | added_details += 1 |
|
137 | 132 | DBSession.flush() |
|
138 | 133 | if report.partition_id not in es_report_docs: |
|
139 | 134 | es_report_docs[report.partition_id] = [] |
|
140 | 135 | es_report_docs[report.partition_id].append(report.es_doc()) |
|
141 | 136 | tags.extend(list(report.tags.items())) |
|
142 | 137 | slow_calls = report.add_slow_calls(report_data, report_group) |
|
143 | 138 | DBSession.flush() |
|
144 | 139 | for s_call in slow_calls: |
|
145 | 140 | if s_call.partition_id not in es_slow_calls_docs: |
|
146 | 141 | es_slow_calls_docs[s_call.partition_id] = [] |
|
147 | 142 | es_slow_calls_docs[s_call.partition_id].append( |
|
148 | 143 | s_call.es_doc()) |
|
149 | 144 | # try generating new stat rows if needed |
|
150 | 145 | else: |
|
151 | 146 | # required for postprocessing to not fail later |
|
152 | 147 | report.report_group = report_group |
|
153 | 148 | |
|
154 | 149 | stat_row = ReportService.generate_stat_rows( |
|
155 | 150 | report, resource, report_group) |
|
156 | 151 | if stat_row.partition_id not in es_reports_stats_rows: |
|
157 | 152 | es_reports_stats_rows[stat_row.partition_id] = [] |
|
158 | 153 | es_reports_stats_rows[stat_row.partition_id].append( |
|
159 | 154 | stat_row.es_doc()) |
|
160 | 155 | |
|
161 | 156 | # see if we should mark every 10th/100th occurrence of the report
|
162 | 157 | last_occurences_10 = int(math.floor(report_group.occurences / 10)) |
|
163 | 158 | curr_occurences_10 = int(math.floor( |
|
164 | 159 | (report_group.occurences + report.occurences) / 10)) |
|
165 | 160 | last_occurences_100 = int( |
|
166 | 161 | math.floor(report_group.occurences / 100)) |
|
167 | 162 | curr_occurences_100 = int(math.floor( |
|
168 | 163 | (report_group.occurences + report.occurences) / 100)) |
|
169 | 164 | notify_occurences_10 = last_occurences_10 != curr_occurences_10 |
|
170 | 165 | notify_occurences_100 = last_occurences_100 != curr_occurences_100 |
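The floor-division comparison above detects when a batch of new occurrences pushes the group total across a multiple of 10 or 100. Isolated, the trick looks like this:

```python
import math

def crossed_threshold(previous, added, step):
    # True when adding `added` occurrences pushes the running total past
    # the next multiple of `step` -- the same floor comparison as above
    return math.floor(previous / step) != math.floor((previous + added) / step)

a = crossed_threshold(9, 2, 10)     # 9 -> 11 crosses the 10 boundary
b = crossed_threshold(11, 4, 10)    # 11 -> 15 stays within the same decade
c = crossed_threshold(95, 10, 100)  # 95 -> 105 crosses the 100 boundary
```

Comparing the two floors (rather than testing `total % step == 0`) is what makes this robust when a single batch adds more than one occurrence.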
|
171 | 166 | report_group.occurences = ReportGroup.occurences + occurences |
|
172 | 167 | report_group.last_timestamp = report.start_time |
|
173 | 168 | report_group.summed_duration = ReportGroup.summed_duration + report.duration |
|
174 | 169 | summed_duration = ReportGroup.summed_duration + report.duration |
|
175 | 170 | summed_occurences = ReportGroup.occurences + occurences |
|
176 | 171 | report_group.average_duration = summed_duration / summed_occurences |
|
177 | 172 | report_group.run_postprocessing(report) |
|
178 | 173 | if added_details: |
|
179 | 174 | report_group.total_reports = ReportGroup.total_reports + 1 |
|
180 | 175 | report_group.last_report = report.id |
|
181 | 176 | report_group.set_notification_info(notify_10=notify_occurences_10, |
|
182 | 177 | notify_100=notify_occurences_100) |
|
183 | 178 | DBSession.flush() |
|
184 | 179 | report_group.get_report().notify_channel(report_group) |
|
185 | 180 | if report_group.partition_id not in es_report_group_docs: |
|
186 | 181 | es_report_group_docs[report_group.partition_id] = [] |
|
187 | 182 | es_report_group_docs[report_group.partition_id].append( |
|
188 | 183 | report_group.es_doc()) |
|
189 | 184 | |
|
190 | 185 | action = 'REPORT' |
|
191 | 186 | log_msg = '%s: %s %s, client: %s, proto: %s' % ( |
|
192 | 187 | action, |
|
193 | 188 | report_data.get('http_status', 'unknown'), |
|
194 | 189 | str(resource), |
|
195 | 190 | report_data.get('client'), |
|
196 | 191 | proto_version) |
|
197 | 192 | log.info(log_msg) |
|
198 | 193 | total_reports = len(dataset) |
|
199 | 194 | redis_pipeline = Datastores.redis.pipeline(transaction=False) |
|
200 | 195 | key = REDIS_KEYS['counters']['reports_per_minute'].format(current_time) |
|
201 | 196 | redis_pipeline.incr(key, total_reports) |
|
202 | 197 | redis_pipeline.expire(key, 3600 * 24) |
|
203 | 198 | key = REDIS_KEYS['counters']['events_per_minute_per_user'].format( |
|
204 | 199 | resource.owner_user_id, current_time) |
|
205 | 200 | redis_pipeline.incr(key, total_reports) |
|
206 | 201 | redis_pipeline.expire(key, 3600) |
|
207 | 202 | key = REDIS_KEYS['counters']['reports_per_hour_per_app'].format( |
|
208 | 203 | resource_id, current_time.replace(minute=0)) |
|
209 | 204 | redis_pipeline.incr(key, total_reports) |
|
210 | 205 | redis_pipeline.expire(key, 3600 * 24 * 7) |
|
211 | 206 | redis_pipeline.sadd( |
|
212 | 207 | REDIS_KEYS['apps_that_got_new_data_per_hour'].format( |
|
213 | 208 | current_time.replace(minute=0)), resource_id) |
|
214 | 209 | redis_pipeline.execute() |
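The pipeline above bumps per-minute and per-hour counters whose keys embed a truncated timestamp, each paired with an expire so stale buckets disappear on their own. The real key templates live in appenlight.lib.redis_keys.REDIS_KEYS and are not shown in this diff, so the format strings below are purely illustrative assumptions:

```python
from datetime import datetime

# Hypothetical key templates -- the real ones come from
# appenlight.lib.redis_keys.REDIS_KEYS and may differ.
REPORTS_PER_MINUTE = 'counters:reports_per_minute:{}'
REPORTS_PER_HOUR_PER_APP = 'counters:reports_per_hour_per_app:{}:{}'

# current_time in the task is truncated the same way before formatting
now = datetime(2016, 5, 1, 12, 34, 56).replace(second=0, microsecond=0)

minute_key = REPORTS_PER_MINUTE.format(now)
hour_key = REPORTS_PER_HOUR_PER_APP.format(42, now.replace(minute=0))
```

Truncating the timestamp before formatting is what makes every report in the same minute (or hour) increment the same key.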
|
215 | 210 | |
|
216 | 211 | add_reports_es(es_report_group_docs, es_report_docs) |
|
217 | 212 | add_reports_slow_calls_es(es_slow_calls_docs) |
|
218 | 213 | add_reports_stats_rows_es(es_reports_stats_rows) |
|
219 | 214 | return True |
|
220 | 215 | except Exception as exc: |
|
221 | 216 | print_traceback(log) |
|
222 | 217 | add_reports.retry(exc=exc) |
|
223 | 218 | |
|
224 | 219 | |
|
225 | 220 | @celery.task(queue="es", default_retry_delay=600, max_retries=144) |
|
226 | 221 | def add_reports_es(report_group_docs, report_docs): |
|
227 | 222 | for k, v in report_group_docs.items(): |
|
228 | 223 | Datastores.es.bulk_index(k, 'report_group', v, id_field="_id") |
|
229 | 224 | for k, v in report_docs.items(): |
|
230 | 225 | Datastores.es.bulk_index(k, 'report', v, id_field="_id", |
|
231 | 226 | parent_field='_parent') |
|
232 | 227 | |
|
233 | 228 | |
|
234 | 229 | @celery.task(queue="es", default_retry_delay=600, max_retries=144) |
|
235 | 230 | def add_reports_slow_calls_es(es_docs): |
|
236 | 231 | for k, v in es_docs.items(): |
|
237 | 232 | Datastores.es.bulk_index(k, 'log', v) |
|
238 | 233 | |
|
239 | 234 | |
|
240 | 235 | @celery.task(queue="es", default_retry_delay=600, max_retries=144) |
|
241 | 236 | def add_reports_stats_rows_es(es_docs): |
|
242 | 237 | for k, v in es_docs.items(): |
|
243 | 238 | Datastores.es.bulk_index(k, 'log', v) |
|
244 | 239 | |
|
245 | 240 | |
|
246 | 241 | @celery.task(queue="logs", default_retry_delay=600, max_retries=144) |
|
247 | 242 | def add_logs(resource_id, request_params, dataset, **kwargs): |
|
248 | 243 | proto_version = request_params.get('protocol_version') |
|
249 | 244 | current_time = datetime.utcnow().replace(second=0, microsecond=0) |
|
250 | 245 | |
|
251 | 246 | try: |
|
252 | 247 | es_docs = collections.defaultdict(list) |
|
253 | 248 | resource = ApplicationService.by_id_cached()(resource_id) |
|
254 | 249 | resource = DBSession.merge(resource, load=False) |
|
255 | 250 | ns_pairs = [] |
|
256 | 251 | for entry in dataset: |
|
257 | 252 | # gather pk and ns so we can remove older versions of the row later
|
258 | 253 | if entry['primary_key'] is not None: |
|
259 | 254 | ns_pairs.append({"pk": entry['primary_key'], |
|
260 | 255 | "ns": entry['namespace']}) |
|
261 | 256 | log_entry = Log() |
|
262 | 257 | log_entry.set_data(entry, resource=resource) |
|
263 | 258 | log_entry._skip_ft_index = True |
|
264 | 259 | resource.logs.append(log_entry) |
|
265 | 260 | DBSession.flush() |
|
266 | 261 | # insert non pk rows first |
|
267 | 262 | if entry['primary_key'] is None: |
|
268 | 263 | es_docs[log_entry.partition_id].append(log_entry.es_doc()) |
|
269 | 264 | |
|
270 | 265 | # 2nd pass to delete all log entries from the DB for the same pk/ns pair
|
271 | 266 | if ns_pairs: |
|
272 | 267 | ids_to_delete = [] |
|
273 | 268 | es_docs = collections.defaultdict(list) |
|
274 | 269 | es_docs_to_delete = collections.defaultdict(list) |
|
275 | 270 | found_pkey_logs = LogService.query_by_primary_key_and_namespace( |
|
276 | 271 | list_of_pairs=ns_pairs) |
|
277 | 272 | log_dict = {} |
|
278 | 273 | for log_entry in found_pkey_logs: |
|
279 | 274 | log_key = (log_entry.primary_key, log_entry.namespace) |
|
280 | 275 | if log_key not in log_dict: |
|
281 | 276 | log_dict[log_key] = [] |
|
282 | 277 | log_dict[log_key].append(log_entry) |
|
283 | 278 | |
|
284 | 279 | for ns, entry_list in log_dict.items(): |
|
285 | 280 | entry_list = sorted(entry_list, key=lambda x: x.timestamp) |
|
286 | 281 | # newest row needs to be indexed in es |
|
287 | 282 | log_entry = entry_list[-1] |
|
288 | 283 | # delete everything from pg and ES, leave the last row in pg |
|
289 | 284 | for e in entry_list[:-1]: |
|
290 | 285 | ids_to_delete.append(e.log_id) |
|
291 | 286 | es_docs_to_delete[e.partition_id].append(e.delete_hash) |
|
292 | 287 | |
|
293 | 288 | es_docs_to_delete[log_entry.partition_id].append( |
|
294 | 289 | log_entry.delete_hash) |
|
295 | 290 | |
|
296 | 291 | es_docs[log_entry.partition_id].append(log_entry.es_doc()) |
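The second pass above groups rows by (primary_key, namespace), keeps the newest row per pair for re-indexing, and queues everything older for deletion. The same bookkeeping with plain tuples — the sample data is made up:

```python
from collections import namedtuple

LogRow = namedtuple('LogRow', 'log_id primary_key namespace timestamp')

rows = [
    LogRow(1, 'job-1', 'worker', 10),
    LogRow(2, 'job-1', 'worker', 30),
    LogRow(3, 'job-1', 'worker', 20),
    LogRow(4, 'job-2', 'worker', 5),
]

# group rows by (pk, ns) pair, like log_dict in the task above
grouped = {}
for row in rows:
    grouped.setdefault((row.primary_key, row.namespace), []).append(row)

ids_to_delete, newest = [], []
for entry_list in grouped.values():
    entry_list = sorted(entry_list, key=lambda r: r.timestamp)
    newest.append(entry_list[-1])                            # re-indexed in ES
    ids_to_delete.extend(r.log_id for r in entry_list[:-1])  # purged from PG
```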
|
297 | 292 | |
|
298 | 293 | if ids_to_delete: |
|
299 | 294 | query = DBSession.query(Log).filter( |
|
300 | 295 | Log.log_id.in_(ids_to_delete)) |
|
301 | 296 | query.delete(synchronize_session=False) |
|
302 | 297 | if es_docs_to_delete: |
|
303 | 298 | # batch this to avoid problems with default ES bulk limits |
|
304 | 299 | for es_index in es_docs_to_delete.keys(): |
|
305 | 300 | for batch in in_batches(es_docs_to_delete[es_index], 20): |
|
306 | 301 | query = {'terms': {'delete_hash': batch}} |
|
307 | 302 | |
|
308 | 303 | try: |
|
309 | 304 | Datastores.es.delete_by_query( |
|
310 | 305 | es_index, 'log', query) |
|
311 | 306 | except pyelasticsearch.ElasticHttpNotFoundError as exc: |
|
312 | 307 | msg = 'skipping index {}'.format(es_index) |
|
313 | 308 | log.info(msg) |
|
314 | 309 | |
|
315 | 310 | total_logs = len(dataset) |
|
316 | 311 | |
|
317 | 312 | log_msg = 'LOG_NEW: %s, entries: %s, proto:%s' % ( |
|
318 | 313 | str(resource), |
|
319 | 314 | total_logs, |
|
320 | 315 | proto_version) |
|
321 | 316 | log.info(log_msg) |
|
322 | 317 | # mark_changed(session) |
|
323 | 318 | redis_pipeline = Datastores.redis.pipeline(transaction=False) |
|
324 | 319 | key = REDIS_KEYS['counters']['logs_per_minute'].format(current_time) |
|
325 | 320 | redis_pipeline.incr(key, total_logs) |
|
326 | 321 | redis_pipeline.expire(key, 3600 * 24) |
|
327 | 322 | key = REDIS_KEYS['counters']['events_per_minute_per_user'].format( |
|
328 | 323 | resource.owner_user_id, current_time) |
|
329 | 324 | redis_pipeline.incr(key, total_logs) |
|
330 | 325 | redis_pipeline.expire(key, 3600) |
|
331 | 326 | key = REDIS_KEYS['counters']['logs_per_hour_per_app'].format( |
|
332 | 327 | resource_id, current_time.replace(minute=0)) |
|
333 | 328 | redis_pipeline.incr(key, total_logs) |
|
334 | 329 | redis_pipeline.expire(key, 3600 * 24 * 7) |
|
335 | 330 | redis_pipeline.sadd( |
|
336 | 331 | REDIS_KEYS['apps_that_got_new_data_per_hour'].format( |
|
337 | 332 | current_time.replace(minute=0)), resource_id) |
|
338 | 333 | redis_pipeline.execute() |
|
339 | 334 | add_logs_es(es_docs) |
|
340 | 335 | return True |
|
341 | 336 | except Exception as exc: |
|
342 | 337 | print_traceback(log) |
|
343 | 338 | add_logs.retry(exc=exc) |
|
344 | 339 | |
|
345 | 340 | |
|
@celery.task(queue="es", default_retry_delay=600, max_retries=144)
def add_logs_es(es_docs):
    for k, v in es_docs.items():
        Datastores.es.bulk_index(k, 'log', v)


@celery.task(queue="metrics", default_retry_delay=600, max_retries=144)
def add_metrics(resource_id, request_params, dataset, proto_version):
    current_time = datetime.utcnow().replace(second=0, microsecond=0)
    try:
        resource = ApplicationService.by_id_cached()(resource_id)
        resource = DBSession.merge(resource, load=False)
        es_docs = []
        rows = []
        for metric in dataset:
            tags = dict(metric['tags'])
            server_n = tags.get('server_name', metric['server_name']).lower()
            tags['server_name'] = server_n or 'unknown'
            new_metric = Metric(
                timestamp=metric['timestamp'],
                resource_id=resource.resource_id,
                namespace=metric['namespace'],
                tags=tags)
            rows.append(new_metric)
            es_docs.append(new_metric.es_doc())
        session = DBSession()
        session.bulk_save_objects(rows)
        session.flush()

        action = 'METRICS'
        metrics_msg = '%s: %s, metrics: %s, proto:%s' % (
            action,
            str(resource),
            len(dataset),
            proto_version
        )
        log.info(metrics_msg)

        mark_changed(session)
        redis_pipeline = Datastores.redis.pipeline(transaction=False)
        key = REDIS_KEYS['counters']['metrics_per_minute'].format(current_time)
        redis_pipeline.incr(key, len(rows))
        redis_pipeline.expire(key, 3600 * 24)
        key = REDIS_KEYS['counters']['events_per_minute_per_user'].format(
            resource.owner_user_id, current_time)
        redis_pipeline.incr(key, len(rows))
        redis_pipeline.expire(key, 3600)
        key = REDIS_KEYS['counters']['metrics_per_hour_per_app'].format(
            resource_id, current_time.replace(minute=0))
        redis_pipeline.incr(key, len(rows))
        redis_pipeline.expire(key, 3600 * 24 * 7)
        redis_pipeline.sadd(
            REDIS_KEYS['apps_that_got_new_data_per_hour'].format(
                current_time.replace(minute=0)), resource_id)
        redis_pipeline.execute()
        add_metrics_es(es_docs)
        return True
    except Exception as exc:
        print_traceback(log)
        add_metrics.retry(exc=exc)


@celery.task(queue="es", default_retry_delay=600, max_retries=144)
def add_metrics_es(es_docs):
    for doc in es_docs:
        partition = 'rcae_m_%s' % doc['timestamp'].strftime('%Y_%m_%d')
        Datastores.es.index(partition, 'log', doc)


@celery.task(queue="default", default_retry_delay=5, max_retries=2)
def check_user_report_notifications(resource_id):
    since_when = datetime.utcnow()
    try:
        request = get_current_request()
        application = ApplicationService.by_id(resource_id)
        if not application:
            return
        error_key = REDIS_KEYS['reports_to_notify_per_type_per_app'].format(
            ReportType.error, resource_id)
        slow_key = REDIS_KEYS['reports_to_notify_per_type_per_app'].format(
            ReportType.slow, resource_id)
        error_group_ids = Datastores.redis.smembers(error_key)
        slow_group_ids = Datastores.redis.smembers(slow_key)
        Datastores.redis.delete(error_key)
        Datastores.redis.delete(slow_key)
        err_gids = [int(g_id) for g_id in error_group_ids]
        slow_gids = [int(g_id) for g_id in list(slow_group_ids)]
        group_ids = err_gids + slow_gids
        occurence_dict = {}
        for g_id in group_ids:
            key = REDIS_KEYS['counters']['report_group_occurences'].format(
                g_id)
            val = Datastores.redis.get(key)
            Datastores.redis.delete(key)
            if val:
                occurence_dict[g_id] = int(val)
            else:
                occurence_dict[g_id] = 1
        report_groups = ReportGroupService.by_ids(group_ids)
        report_groups.options(sa.orm.joinedload(ReportGroup.last_report_ref))

        ApplicationService.check_for_groups_alert(
            application, 'alert', report_groups=report_groups,
            occurence_dict=occurence_dict)
        users = set([p.user for p in application.users_for_perm('view')])
        report_groups = report_groups.all()
        for user in users:
            UserService.report_notify(user, request, application,
                                      report_groups=report_groups,
                                      occurence_dict=occurence_dict)
        for group in report_groups:
            # marks report_groups as notified
            if not group.notified:
                group.notified = True
    except Exception as exc:
        print_traceback(log)
        raise


@celery.task(queue="default", default_retry_delay=5, max_retries=2)
def check_alerts(resource_id):
    since_when = datetime.utcnow()
    try:
        request = get_current_request()
        application = ApplicationService.by_id(resource_id)
        if not application:
            return
        error_key = REDIS_KEYS[
            'reports_to_notify_per_type_per_app_alerting'].format(
            ReportType.error, resource_id)
        slow_key = REDIS_KEYS[
            'reports_to_notify_per_type_per_app_alerting'].format(
            ReportType.slow, resource_id)
        error_group_ids = Datastores.redis.smembers(error_key)
        slow_group_ids = Datastores.redis.smembers(slow_key)
        Datastores.redis.delete(error_key)
        Datastores.redis.delete(slow_key)
        err_gids = [int(g_id) for g_id in error_group_ids]
        slow_gids = [int(g_id) for g_id in list(slow_group_ids)]
        group_ids = err_gids + slow_gids
        occurence_dict = {}
        for g_id in group_ids:
            key = REDIS_KEYS['counters'][
                'report_group_occurences_alerting'].format(
                g_id)
            val = Datastores.redis.get(key)
            Datastores.redis.delete(key)
            if val:
                occurence_dict[g_id] = int(val)
            else:
                occurence_dict[g_id] = 1
        report_groups = ReportGroupService.by_ids(group_ids)
        report_groups.options(sa.orm.joinedload(ReportGroup.last_report_ref))

        ApplicationService.check_for_groups_alert(
            application, 'alert', report_groups=report_groups,
            occurence_dict=occurence_dict, since_when=since_when)
    except Exception as exc:
        print_traceback(log)
        raise


@celery.task(queue="default", default_retry_delay=1, max_retries=2)
def close_alerts():
    log.warning('Checking alerts')
    since_when = datetime.utcnow()
    try:
        event_types = [Event.types['error_report_alert'],
                       Event.types['slow_report_alert'], ]
        statuses = [Event.statuses['active']]
        # get events older than 5 min
        events = EventService.by_type_and_status(
            event_types,
            statuses,
            older_than=(since_when - timedelta(minutes=5)))
        for event in events:
            # see if we can close them
            event.validate_or_close(
                since_when=(since_when - timedelta(minutes=1)))
    except Exception as exc:
        print_traceback(log)
        raise


@celery.task(queue="default", default_retry_delay=600, max_retries=144)
def update_tag_counter(tag_name, tag_value, count):
    try:
        query = DBSession.query(Tag).filter(Tag.name == tag_name).filter(
            sa.cast(Tag.value, sa.types.TEXT) == sa.cast(json.dumps(tag_value),
                                                         sa.types.TEXT))
        query.update({'times_seen': Tag.times_seen + count,
                      'last_timestamp': datetime.utcnow()},
                     synchronize_session=False)
        session = DBSession()
        mark_changed(session)
        return True
    except Exception as exc:
        print_traceback(log)
        update_tag_counter.retry(exc=exc)


@celery.task(queue="default")
def update_tag_counters():
    """
    Sets task to update counters for application tags
    """
    tags = Datastores.redis.lrange(REDIS_KEYS['seen_tag_list'], 0, -1)
    Datastores.redis.delete(REDIS_KEYS['seen_tag_list'])
    c = collections.Counter(tags)
    for t_json, count in c.items():
        tag_info = json.loads(t_json)
        update_tag_counter.delay(tag_info[0], tag_info[1], count)


@celery.task(queue="default")
def daily_digest():
    """
    Sends daily digest with top 50 error reports
    """
    request = get_current_request()
    apps = Datastores.redis.smembers(REDIS_KEYS['apps_that_had_reports'])
    Datastores.redis.delete(REDIS_KEYS['apps_that_had_reports'])
    since_when = datetime.utcnow() - timedelta(hours=8)
    log.warning('Generating daily digests')
    for resource_id in apps:
        resource_id = resource_id.decode('utf8')
        end_date = datetime.utcnow().replace(microsecond=0, second=0)
        filter_settings = {'resource': [resource_id],
                           'tags': [{'name': 'type',
                                     'value': ['error'], 'op': None}],
                           'type': 'error', 'start_date': since_when,
                           'end_date': end_date}

        reports = ReportGroupService.get_trending(
            request, filter_settings=filter_settings, limit=50)

        application = ApplicationService.by_id(resource_id)
        if application:
            users = set([p.user for p in application.users_for_perm('view')])
            for user in users:
                user.send_digest(request, application, reports=reports,
                                 since_when=since_when)


@celery.task(queue="default")
def notifications_reports():
    """
    Loop that checks redis for info and then issues new tasks to celery to
    issue notifications
    """
    apps = Datastores.redis.smembers(REDIS_KEYS['apps_that_had_reports'])
    Datastores.redis.delete(REDIS_KEYS['apps_that_had_reports'])
    for app in apps:
        log.warning('Notify for app: %s' % app)
        check_user_report_notifications.delay(app.decode('utf8'))


@celery.task(queue="default")
def alerting_reports():
    """
    Loop that checks redis for info and then issues new tasks to celery to
    perform the following:
    - which applications should have new alerts opened
    """

    apps = Datastores.redis.smembers(REDIS_KEYS['apps_that_had_reports_alerting'])
    Datastores.redis.delete(REDIS_KEYS['apps_that_had_reports_alerting'])
    for app in apps:
        log.warning('Notify for app: %s' % app)
        check_alerts.delay(app.decode('utf8'))


@celery.task(queue="default", soft_time_limit=3600 * 4,
             hard_time_limit=3600 * 4, max_retries=144)
def logs_cleanup(resource_id, filter_settings):
    request = get_current_request()
    request.tm.begin()
    es_query = {
        "_source": False,
        "size": 5000,
        "query": {
            "filtered": {
                "filter": {
                    "and": [{"term": {"resource_id": resource_id}}]
                }
            }
        }
    }

    query = DBSession.query(Log).filter(Log.resource_id == resource_id)
    if filter_settings['namespace']:
        query = query.filter(Log.namespace == filter_settings['namespace'][0])
        es_query['query']['filtered']['filter']['and'].append(
            {"term": {"namespace": filter_settings['namespace'][0]}}
        )
    query.delete(synchronize_session=False)
    request.tm.commit()
    result = request.es_conn.search(es_query, index='rcae_l_*',
                                    doc_type='log', es_scroll='1m',
                                    es_search_type='scan')
    scroll_id = result['_scroll_id']
    while True:
        log.warning('log_cleanup, app:{} ns:{} batch'.format(
            resource_id,
            filter_settings['namespace']
        ))
        es_docs_to_delete = []
        result = request.es_conn.send_request(
            'POST', ['_search', 'scroll'],
            body=scroll_id, query_params={"scroll": '1m'})
        scroll_id = result['_scroll_id']
        if not result['hits']['hits']:
            break
        for doc in result['hits']['hits']:
            es_docs_to_delete.append({"id": doc['_id'],
                                      "index": doc['_index']})

        for batch in in_batches(es_docs_to_delete, 10):
            Datastores.es.bulk([Datastores.es.delete_op(doc_type='log',
                                                        **to_del)
                                for to_del in batch])
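Both `add_logs` and `logs_cleanup` above chunk Elasticsearch deletes with `in_batches(items, n)` to stay under default ES bulk limits. The helper itself is imported elsewhere in this module and is not shown in this diff, so the following is only a sketch of its assumed behavior (yield successive lists of at most `n` items):

```python
import itertools


def in_batches(seq, batch_size):
    # Yield successive lists of at most batch_size items from seq.
    # Sketch only: the real appenlight helper may differ in details.
    it = iter(seq)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            return
        yield batch


# e.g. splitting 45 doc hashes into ES-friendly chunks of 20
batches = list(in_batches(range(45), 20))
```

With this shape, the last batch is simply shorter rather than padded, which matches how the tasks above pass each batch straight into a `terms` filter or a bulk delete.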
@@ -1,24 +1,19 @@
 # -*- coding: utf-8 -*-

-# Copyright (C) 2010-2016 RhodeCode GmbH
+# Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors
 #
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License, version 3
-# (only), as published by the Free Software Foundation.
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
 #
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
+# http://www.apache.org/licenses/LICENSE-2.0
 #
-# You should have received a copy of the GNU Affero General Public License
-# along with this program. If not, see <http://www.gnu.org/licenses/>.
-#
-# This program is dual-licensed. If you wish to learn more about the
-# AppEnlight Enterprise Edition, including its added features, Support
-# services, and proprietary license terms, please see
-# https://rhodecode.com/licenses/
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.

 def filter_callable(structure, section=None):
     structure['SOMEVAL'] = '***REMOVED***'
     return structure
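The `filter_callable` hook in the file above lets a deployment scrub sensitive values from a payload before it is stored. How AppEnlight invokes the hook (and which `section` values it passes) is not shown in this diff, so the call below is only an illustrative sketch:

```python
def filter_callable(structure, section=None):
    # Replace the sensitive key before the payload is persisted/displayed.
    structure['SOMEVAL'] = '***REMOVED***'
    return structure


# Hypothetical payload, only for illustration.
payload = {'SOMEVAL': 'secret-token', 'user': 'bob'}
scrubbed = filter_callable(dict(payload))  # pass a copy to keep the original
```

Note the hook mutates the dict it receives; passing a copy, as here, is only needed if the caller wants to keep the unscrubbed original.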
@@ -1,901 +1,896 @@
 # -*- coding: utf-8 -*-

-# Copyright (C) 2010-2016 RhodeCode GmbH
+# Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors
 #
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License, version 3
-# (only), as published by the Free Software Foundation.
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
 #
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
+# http://www.apache.org/licenses/LICENSE-2.0
 #
-# You should have received a copy of the GNU Affero General Public License
-# along with this program. If not, see <http://www.gnu.org/licenses/>.
-#
-# This program is dual-licensed. If you wish to learn more about the
-# AppEnlight Enterprise Edition, including its added features, Support
-# services, and proprietary license terms, please see
-# https://rhodecode.com/licenses/
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.

 import wtforms
 import formencode
 import re
 import pyramid.threadlocal
 import datetime
 import appenlight.lib.helpers as h

 from appenlight.models.user import User
 from appenlight.models.group import Group
 from appenlight.models import DBSession
 from appenlight.models.alert_channel import AlertChannel
 from appenlight.models.integrations import IntegrationException
 from appenlight.models.integrations.campfire import CampfireIntegration
 from appenlight.models.integrations.bitbucket import BitbucketIntegration
 from appenlight.models.integrations.github import GithubIntegration
 from appenlight.models.integrations.flowdock import FlowdockIntegration
 from appenlight.models.integrations.hipchat import HipchatIntegration
 from appenlight.models.integrations.jira import JiraClient
 from appenlight.models.integrations.slack import SlackIntegration
 from appenlight.lib.ext_json import json
 from wtforms.ext.csrf.form import SecureForm
 from wtforms.compat import iteritems
 from collections import defaultdict

 _ = str

 strip_filter = lambda x: x.strip() if x else None
 uppercase_filter = lambda x: x.upper() if x else None

 FALSE_VALUES = ('false', '', False, None)


 class CSRFException(Exception):
     pass


 class ReactorForm(SecureForm):
     def __init__(self, formdata=None, obj=None, prefix='', csrf_context=None,
                  **kwargs):
         super(ReactorForm, self).__init__(formdata=formdata, obj=obj,
                                           prefix=prefix,
                                           csrf_context=csrf_context, **kwargs)
         self._csrf_context = csrf_context

     def generate_csrf_token(self, csrf_context):
         return csrf_context.session.get_csrf_token()

     def validate_csrf_token(self, field):
         request = self._csrf_context or pyramid.threadlocal.get_current_request()
         is_from_auth_token = 'auth:auth_token' in request.effective_principals
         if is_from_auth_token:
             return True

         if field.data != field.current_token:
             # try to save the day by using token from angular
             if request.headers.get('X-XSRF-TOKEN') != field.current_token:
                 raise CSRFException('Invalid CSRF token')

     @property
     def errors_dict(self):
         r_dict = defaultdict(list)
         for k, errors in self.errors.items():
             r_dict[k].extend([str(e) for e in errors])
         return r_dict

     @property
     def errors_json(self):
         return json.dumps(self.errors_dict)

     def populate_obj(self, obj, ignore_none=False):
         """
         Populates the attributes of the passed `obj` with data from the form's
         fields.

         :note: This is a destructive operation; Any attribute with the same name
                as a field will be overridden. Use with caution.
         """
         if ignore_none:
             for name, field in iteritems(self._fields):
                 if field.data is not None:
                     field.populate_obj(obj, name)
         else:
             for name, field in iteritems(self._fields):
                 field.populate_obj(obj, name)

     css_classes = {}
     ignore_labels = {}


 class SignInForm(ReactorForm):
     came_from = wtforms.HiddenField()
     sign_in_user_name = wtforms.StringField(_('User Name'))
     sign_in_user_password = wtforms.PasswordField(_('Password'))

     ignore_labels = ['submit']
     css_classes = {'submit': 'btn btn-primary'}

     html_attrs = {'sign_in_user_name': {'placeholder': 'Your login'},
                   'sign_in_user_password': {
                       'placeholder': 'Your password'}}


 from wtforms.widgets import html_params, HTMLString


 def select_multi_checkbox(field, ul_class='set', **kwargs):
     """Render a multi-checkbox widget"""
     kwargs.setdefault('type', 'checkbox')
     field_id = kwargs.pop('id', field.id)
     html = ['<ul %s>' % html_params(id=field_id, class_=ul_class)]
     for value, label, checked in field.iter_choices():
         choice_id = '%s-%s' % (field_id, value)
         options = dict(kwargs, name=field.name, value=value, id=choice_id)
         if checked:
             options['checked'] = 'checked'
         html.append('<li><input %s /> ' % html_params(**options))
         html.append('<label for="%s">%s</label></li>' % (choice_id, label))
     html.append('</ul>')
     return HTMLString(''.join(html))


 def button_widget(field, button_cls='ButtonField btn btn-default', **kwargs):
     """Render a button widget"""
     kwargs.setdefault('type', 'button')
     field_id = kwargs.pop('id', field.id)
     kwargs.setdefault('value', field.label.text)
     html = ['<button %s>%s</button>' % (html_params(id=field_id,
                                                     class_=button_cls),
                                         kwargs['value'],)]
     return HTMLString(''.join(html))


 def clean_whitespace(value):
     if value:
         return value.strip()
     return value


 def found_username_validator(form, field):
     user = User.by_user_name(field.data)
     # sets user to recover in email validator
     form.field_user = user
     if not user:
         raise wtforms.ValidationError('This username does not exist')


 def found_username_email_validator(form, field):
     user = User.by_email(field.data)
     if not user:
         raise wtforms.ValidationError('Email is incorrect')


 def unique_username_validator(form, field):
     user = User.by_user_name(field.data)
     if user:
         raise wtforms.ValidationError('This username already exists in system')


 def unique_groupname_validator(form, field):
     group = Group.by_group_name(field.data)
     mod_group = getattr(form, '_modified_group', None)
     if group and (not mod_group or mod_group.id != group.id):
         raise wtforms.ValidationError(
             'This group name already exists in system')


 def unique_email_validator(form, field):
     user = User.by_email(field.data)
     if user:
         raise wtforms.ValidationError('This email already exists in system')


 def email_validator(form, field):
     validator = formencode.validators.Email()
     try:
         validator.to_python(field.data)
     except formencode.Invalid as e:
         raise wtforms.ValidationError(e)


 def unique_alert_email_validator(form, field):
     q = DBSession.query(AlertChannel)
     q = q.filter(AlertChannel.channel_name == 'email')
     q = q.filter(AlertChannel.channel_value == field.data)
     email = q.first()
     if email:
         raise wtforms.ValidationError(
             'This email already exists in alert system')


 def blocked_email_validator(form, field):
     blocked_emails = [
         'goood-mail.org',
         'shoeonlineblog.com',
         'louboutinemart.com',
         'guccibagshere.com',
         'nikeshoesoutletforsale.com'
     ]
     data = field.data or ''
     domain = data.split('@')[-1]
     if domain in blocked_emails:
         raise wtforms.ValidationError('Don\'t spam')


 def old_password_validator(form, field):
     if not field.user.check_password(field.data or ''):
         raise wtforms.ValidationError('You need to enter correct password')


 class UserRegisterForm(ReactorForm):
     user_name = wtforms.StringField(
         _('User Name'),
         filters=[strip_filter],
         validators=[
             wtforms.validators.Length(min=2, max=30),
             wtforms.validators.Regexp(
                 re.compile(r'^[\.\w-]+$', re.UNICODE),
                 message="Invalid characters used"),
             unique_username_validator,
             wtforms.validators.DataRequired()
         ])

     user_password = wtforms.PasswordField(_('User Password'),
                                           filters=[strip_filter],
                                           validators=[
                                               wtforms.validators.Length(min=4),
                                               wtforms.validators.DataRequired()
                                           ])

     email = wtforms.StringField(_('Email Address'),
                                 filters=[strip_filter],
                                 validators=[email_validator,
                                             unique_email_validator,
                                             blocked_email_validator,
                                             wtforms.validators.DataRequired()])
     first_name = wtforms.HiddenField(_('First Name'))
     last_name = wtforms.HiddenField(_('Last Name'))

     ignore_labels = ['submit']
     css_classes = {'submit': 'btn btn-primary'}

     html_attrs = {'user_name': {'placeholder': 'Your login'},
                   'user_password': {'placeholder': 'Your password'},
                   'email': {'placeholder': 'Your email'}}


 class UserCreateForm(UserRegisterForm):
     status = wtforms.BooleanField('User status',
                                   false_values=FALSE_VALUES)


 class UserUpdateForm(UserCreateForm):
     user_name = None
     user_password = wtforms.PasswordField(_('User Password'),
                                           filters=[strip_filter],
                                           validators=[
                                               wtforms.validators.Length(min=4),
                                               wtforms.validators.Optional()
                                           ])
     email = wtforms.StringField(_('Email Address'),
                                 filters=[strip_filter],
                                 validators=[email_validator,
                                             wtforms.validators.DataRequired()])


 class LostPasswordForm(ReactorForm):
     email = wtforms.StringField(_('Email Address'),
                                 filters=[strip_filter],
                                 validators=[email_validator,
                                             found_username_email_validator,
                                             wtforms.validators.DataRequired()])

     submit = wtforms.SubmitField(_('Reset password'))
     ignore_labels = ['submit']
     css_classes = {'submit': 'btn btn-primary'}


 class ChangePasswordForm(ReactorForm):
     old_password = wtforms.PasswordField(
         'Old Password',
         filters=[strip_filter],
         validators=[old_password_validator,
                     wtforms.validators.DataRequired()])

     new_password = wtforms.PasswordField(
         'New Password',
         filters=[strip_filter],
         validators=[wtforms.validators.Length(min=4),
                     wtforms.validators.DataRequired()])
     new_password_confirm = wtforms.PasswordField(
         'Confirm Password',
         filters=[strip_filter],
         validators=[wtforms.validators.EqualTo('new_password'),
                     wtforms.validators.DataRequired()])
     submit = wtforms.SubmitField('Change Password')
     ignore_labels = ['submit']
     css_classes = {'submit': 'btn btn-primary'}


 class CheckPasswordForm(ReactorForm):
     password = wtforms.PasswordField(
         'Password',
         filters=[strip_filter],
         validators=[old_password_validator,
                     wtforms.validators.DataRequired()])


 class NewPasswordForm(ReactorForm):
     new_password = wtforms.PasswordField(
         'New Password',
332 | 327 | filters=[strip_filter], |
|
333 | 328 | validators=[wtforms.validators.Length(min=4), |
|
334 | 329 | wtforms.validators.DataRequired()]) |
|
335 | 330 | new_password_confirm = wtforms.PasswordField( |
|
336 | 331 | 'Confirm Password', |
|
337 | 332 | filters=[strip_filter], |
|
338 | 333 | validators=[wtforms.validators.EqualTo('new_password'), |
|
339 | 334 | wtforms.validators.DataRequired()]) |
|
340 | 335 | submit = wtforms.SubmitField('Set Password') |
|
341 | 336 | ignore_labels = ['submit'] |
|
342 | 337 | css_classes = {'submit': 'btn btn-primary'} |
|
343 | 338 | |
|
344 | 339 | |
|
345 | 340 | class CORSTextAreaField(wtforms.StringField): |
|
346 | 341 | """ |
|
347 | 342 | This field represents an HTML ``<textarea>`` and can be used to take |
|
348 | 343 | multi-line input. |
|
349 | 344 | """ |
|
350 | 345 | widget = wtforms.widgets.TextArea() |
|
351 | 346 | |
|
352 | 347 | def process_formdata(self, valuelist): |
|
353 | 348 | self.data = [] |
|
354 | 349 | if valuelist: |
|
355 | 350 | data = [x.strip() for x in valuelist[0].split('\n')] |
|
356 | 351 | for d in data: |
|
357 | 352 | if not d: |
|
358 | 353 | continue |
|
359 | 354 | if d.startswith('www.'): |
|
360 | 355 | d = d[4:] |
|
361 | 356 | self.data.append(d)
|
363 | 358 | else: |
|
364 | 359 | self.data = [] |
|
365 | 360 | self.data = '\n'.join(self.data) |
|
366 | 361 | |
|
367 | 362 | |
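The `process_formdata` logic above trims each submitted line, drops blanks, and strips a leading `www.` before rejoining. A standalone sketch of that normalization (the helper name is hypothetical, not part of the form class):

```python
def normalize_cors_domains(raw: str) -> str:
    """Normalize a newline-separated domain list the way the field does:
    trim whitespace, drop empty lines, and strip a leading 'www.'."""
    cleaned = []
    for line in raw.split('\n'):
        domain = line.strip()
        if not domain:
            continue
        if domain.startswith('www.'):
            domain = domain[4:]
        cleaned.append(domain)
    return '\n'.join(cleaned)
```

For example, `normalize_cors_domains('www.example.com\n\n  foo.org  \n')` yields `'example.com\nfoo.org'`.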
|
368 | 363 | class ApplicationCreateForm(ReactorForm): |
|
369 | 364 | resource_name = wtforms.StringField( |
|
370 | 365 | _('Application name'), |
|
371 | 366 | filters=[strip_filter], |
|
372 | 367 | validators=[wtforms.validators.Length(min=1), |
|
373 | 368 | wtforms.validators.DataRequired()]) |
|
374 | 369 | |
|
375 | 370 | domains = CORSTextAreaField( |
|
376 | 371 | _('Domain names for CORS headers'),
|
377 | 372 | validators=[wtforms.validators.Length(min=1), |
|
378 | 373 | wtforms.validators.Optional()], |
|
379 | 374 | description='Required for JavaScript error '

380 | 375 | 'tracking (one domain per line, skip the http:// part)')
|
381 | 376 | |
|
382 | 377 | submit = wtforms.SubmitField(_('Create Application')) |
|
383 | 378 | |
|
384 | 379 | ignore_labels = ['submit'] |
|
385 | 380 | css_classes = {'submit': 'btn btn-primary'} |
|
386 | 381 | html_attrs = {'resource_name': {'placeholder': 'Application Name'}, |
|
387 | 382 | 'uptime_url': {'placeholder': 'http://somedomain.com'}} |
|
388 | 383 | |
|
389 | 384 | |
|
390 | 385 | class ApplicationUpdateForm(ApplicationCreateForm): |
|
391 | 386 | default_grouping = wtforms.SelectField( |
|
392 | 387 | _('Default grouping for errors'), |
|
393 | 388 | choices=[('url_type', 'Error Type + location',), |
|
394 | 389 | ('url_traceback', 'Traceback + location',), |
|
395 | 390 | ('traceback_server', 'Traceback + Server',)], |
|
396 | 391 | default='url_traceback') |
|
397 | 392 | |
|
398 | 393 | error_report_threshold = wtforms.IntegerField( |
|
399 | 394 | _('Alert on error reports'), |
|
400 | 395 | validators=[ |
|
401 | 396 | wtforms.validators.NumberRange(min=1), |
|
402 | 397 | wtforms.validators.DataRequired() |
|
403 | 398 | ], |
|
404 | 399 | description='The application must send at least this many '

405 | 400 | 'error reports per minute to open an alert'
|
406 | 401 | ) |
|
407 | 402 | |
|
408 | 403 | slow_report_threshold = wtforms.IntegerField( |
|
409 | 404 | _('Alert on slow reports'), |
|
410 | 405 | validators=[wtforms.validators.NumberRange(min=1), |
|
411 | 406 | wtforms.validators.DataRequired()], |
|
412 | 407 | description='The application must send at least this many '

413 | 408 | 'slow reports per minute to open an alert')
|
414 | 409 | |
|
415 | 410 | allow_permanent_storage = wtforms.BooleanField( |
|
416 | 411 | _('Permanent logs'), |
|
417 | 412 | false_values=FALSE_VALUES, |
|
418 | 413 | description=_( |
|
419 | 414 | 'Allow permanent storage of logs in separate DB partitions')) |
|
420 | 415 | |
|
421 | 416 | submit = wtforms.SubmitField(_('Update Application'))
|
422 | 417 | |
|
423 | 418 | |
|
424 | 419 | class UserSearchSchemaForm(ReactorForm): |
|
425 | 420 | user_name = wtforms.StringField('User Name', |
|
426 | 421 | filters=[strip_filter], ) |
|
427 | 422 | |
|
428 | 423 | submit = wtforms.SubmitField(_('Search User')) |
|
429 | 424 | ignore_labels = ['submit'] |
|
430 | 425 | css_classes = {'submit': 'btn btn-primary'} |
|
431 | 426 | |
|
|
433 | 428 | |
|
434 | 429 | |
|
435 | 430 | class YesNoForm(ReactorForm): |
|
436 | 431 | no = wtforms.SubmitField('No', default='') |
|
437 | 432 | yes = wtforms.SubmitField('Yes', default='') |
|
438 | 433 | ignore_labels = ['submit'] |
|
439 | 434 | css_classes = {'submit': 'btn btn-primary'} |
|
440 | 435 | |
|
441 | 436 | |
|
442 | 437 | status_codes = [('', 'All',), ('500', '500',), ('404', '404',)] |
|
443 | 438 | |
|
444 | 439 | priorities = [('', 'All',)] |
|
445 | 440 | for i in range(1, 11): |
|
446 | 441 | priorities.append((str(i), str(i),)) |
|
447 | 442 | |
|
448 | 443 | report_status_choices = [('', 'All',), |
|
449 | 444 | ('never_reviewed', 'Never reviewed',),

450 | 445 | ('reviewed', 'Reviewed',),
|
451 | 446 | ('public', 'Public',), |
|
452 | 447 | ('fixed', 'Fixed',), ] |
|
453 | 448 | |
|
454 | 449 | |
|
455 | 450 | class ReportBrowserForm(ReactorForm): |
|
456 | 451 | applications = wtforms.SelectMultipleField('Applications', |
|
457 | 452 | widget=select_multi_checkbox) |
|
458 | 453 | http_status = wtforms.SelectField('HTTP Status', choices=status_codes) |
|
459 | 454 | priority = wtforms.SelectField('Priority', choices=priorities, default='') |
|
460 | 455 | start_date = wtforms.DateField('Start Date') |
|
461 | 456 | end_date = wtforms.DateField('End Date') |
|
462 | 457 | error = wtforms.StringField('Error') |
|
463 | 458 | url_path = wtforms.StringField('URL Path') |
|
464 | 459 | url_domain = wtforms.StringField('URL Domain') |
|
465 | 460 | report_status = wtforms.SelectField('Report status', |
|
466 | 461 | choices=report_status_choices, |
|
467 | 462 | default='') |
|
468 | 463 | submit = wtforms.SubmitField('<span class="glyphicon glyphicon-search">' |
|
469 | 464 | '</span> Filter results', |
|
470 | 465 | widget=button_widget) |
|
471 | 466 | |
|
472 | 467 | ignore_labels = ['submit'] |
|
473 | 468 | css_classes = {'submit': 'btn btn-primary'} |
|
474 | 469 | |
|
475 | 470 | |
|
476 | 471 | slow_report_status_choices = [('', 'All',), |
|
477 | 472 | ('never_reviewed', 'Never reviewed',),

478 | 473 | ('reviewed', 'Reviewed',),
|
479 | 474 | ('public', 'Public',), ] |
|
480 | 475 | |
|
481 | 476 | |
|
482 | 477 | class BulkOperationForm(ReactorForm): |
|
483 | 478 | applications = wtforms.SelectField('Applications') |
|
484 | 479 | start_date = wtforms.DateField( |
|
485 | 480 | 'Start Date', |
|
486 | 481 | default=lambda: datetime.datetime.utcnow() - datetime.timedelta( |
|
487 | 482 | days=90)) |
|
488 | 483 | end_date = wtforms.DateField('End Date') |
|
489 | 484 | confirm = wtforms.BooleanField( |
|
490 | 485 | 'Confirm operation', |
|
491 | 486 | validators=[wtforms.validators.DataRequired()]) |
|
492 | 487 | |
|
493 | 488 | |
|
494 | 489 | class LogBrowserForm(ReactorForm): |
|
495 | 490 | applications = wtforms.SelectMultipleField('Applications', |
|
496 | 491 | widget=select_multi_checkbox) |
|
497 | 492 | start_date = wtforms.DateField('Start Date') |
|
498 | 493 | log_level = wtforms.StringField('Log level') |
|
499 | 494 | message = wtforms.StringField('Message') |
|
500 | 495 | namespace = wtforms.StringField('Namespace') |
|
501 | 496 | submit = wtforms.SubmitField( |
|
502 | 497 | '<span class="glyphicon glyphicon-search"></span> Filter results', |
|
503 | 498 | widget=button_widget) |
|
504 | 499 | ignore_labels = ['submit'] |
|
505 | 500 | css_classes = {'submit': 'btn btn-primary'} |
|
506 | 501 | |
|
507 | 502 | |
|
508 | 503 | class CommentForm(ReactorForm): |
|
509 | 504 | body = wtforms.TextAreaField('Comment', validators=[ |
|
510 | 505 | wtforms.validators.Length(min=1), |
|
511 | 506 | wtforms.validators.DataRequired() |
|
512 | 507 | ]) |
|
513 | 508 | submit = wtforms.SubmitField('Comment', ) |
|
514 | 509 | ignore_labels = ['submit'] |
|
515 | 510 | css_classes = {'submit': 'btn btn-primary'} |
|
516 | 511 | |
|
517 | 512 | |
|
518 | 513 | class EmailChannelCreateForm(ReactorForm): |
|
519 | 514 | email = wtforms.StringField(_('Email Address'), |
|
520 | 515 | filters=[strip_filter], |
|
521 | 516 | validators=[email_validator, |
|
522 | 517 | unique_alert_email_validator, |
|
523 | 518 | wtforms.validators.DataRequired()]) |
|
524 | 519 | submit = wtforms.SubmitField('Add email channel', ) |
|
525 | 520 | ignore_labels = ['submit'] |
|
526 | 521 | css_classes = {'submit': 'btn btn-primary'} |
|
527 | 522 | |
|
528 | 523 | |
|
529 | 524 | def gen_user_profile_form(): |
|
530 | 525 | class UserProfileForm(ReactorForm): |
|
531 | 526 | email = wtforms.StringField( |
|
532 | 527 | _('Email Address'), |
|
533 | 528 | validators=[email_validator, wtforms.validators.DataRequired()]) |
|
534 | 529 | first_name = wtforms.StringField(_('First Name')) |
|
535 | 530 | last_name = wtforms.StringField(_('Last Name')) |
|
536 | 531 | company_name = wtforms.StringField(_('Company Name')) |
|
537 | 532 | company_address = wtforms.TextAreaField(_('Company Address')) |
|
538 | 533 | zip_code = wtforms.StringField(_('ZIP code')) |
|
539 | 534 | city = wtforms.StringField(_('City')) |
|
540 | 535 | notifications = wtforms.BooleanField('Account notifications', |
|
541 | 536 | false_values=FALSE_VALUES) |
|
542 | 537 | submit = wtforms.SubmitField(_('Update Account')) |
|
543 | 538 | ignore_labels = ['submit'] |
|
544 | 539 | css_classes = {'submit': 'btn btn-primary'} |
|
545 | 540 | |
|
546 | 541 | return UserProfileForm |
|
547 | 542 | |
|
548 | 543 | |
|
549 | 544 | class PurgeAppForm(ReactorForm): |
|
550 | 545 | resource_id = wtforms.HiddenField( |
|
551 | 546 | 'App Id', |
|
552 | 547 | validators=[wtforms.validators.DataRequired()]) |
|
553 | 548 | days = wtforms.IntegerField( |
|
554 | 549 | 'Days', |
|
555 | 550 | validators=[wtforms.validators.DataRequired()]) |
|
556 | 551 | password = wtforms.PasswordField( |
|
557 | 552 | 'Admin Password', |
|
558 | 553 | validators=[old_password_validator, wtforms.validators.DataRequired()]) |
|
559 | 554 | submit = wtforms.SubmitField(_('Purge Data')) |
|
560 | 555 | ignore_labels = ['submit'] |
|
561 | 556 | css_classes = {'submit': 'btn btn-primary'} |
|
562 | 557 | |
|
563 | 558 | |
|
564 | 559 | class IntegrationRepoForm(ReactorForm): |
|
565 | 560 | host_name = wtforms.StringField("Service Host", default='') |
|
566 | 561 | user_name = wtforms.StringField( |
|
567 | 562 | "User Name", |
|
568 | 563 | filters=[strip_filter], |
|
569 | 564 | validators=[wtforms.validators.DataRequired(), |
|
570 | 565 | wtforms.validators.Length(min=1)]) |
|
571 | 566 | repo_name = wtforms.StringField( |
|
572 | 567 | "Repo Name", |
|
573 | 568 | filters=[strip_filter], |
|
574 | 569 | validators=[wtforms.validators.DataRequired(), |
|
575 | 570 | wtforms.validators.Length(min=1)]) |
|
576 | 571 | |
|
577 | 572 | |
|
578 | 573 | class IntegrationBitbucketForm(IntegrationRepoForm): |
|
579 | 574 | host_name = wtforms.StringField("Service Host", |
|
580 | 575 | default='https://bitbucket.org') |
|
581 | 576 | |
|
582 | 577 | def validate_user_name(self, field): |
|
583 | 578 | try: |
|
584 | 579 | request = pyramid.threadlocal.get_current_request() |
|
585 | 580 | client = BitbucketIntegration.create_client( |
|
586 | 581 | request, |
|
587 | 582 | self.user_name.data, |
|
588 | 583 | self.repo_name.data) |
|
589 | 584 | client.get_assignees() |
|
590 | 585 | except IntegrationException as e: |
|
591 | 586 | raise wtforms.validators.ValidationError(str(e)) |
|
592 | 587 | |
|
593 | 588 | |
|
594 | 589 | class IntegrationGithubForm(IntegrationRepoForm): |
|
595 | 590 | host_name = wtforms.StringField("Service Host", |
|
596 | 591 | default='https://github.com') |
|
597 | 592 | |
|
598 | 593 | def validate_user_name(self, field): |
|
599 | 594 | try: |
|
600 | 595 | request = pyramid.threadlocal.get_current_request() |
|
601 | 596 | client = GithubIntegration.create_client( |
|
602 | 597 | request, |
|
603 | 598 | self.user_name.data, |
|
604 | 599 | self.repo_name.data) |
|
605 | 600 | client.get_assignees() |
|
606 | 601 | except IntegrationException as e: |
|
607 | 602 | raise wtforms.validators.ValidationError(str(e))
|
609 | 604 | |
|
610 | 605 | |
|
611 | 606 | def filter_rooms(data): |
|
612 | 607 | if data is not None: |
|
613 | 608 | rooms = data.split(',') |
|
614 | 609 | return ','.join([r.strip() for r in rooms]) |
|
615 | 610 | |
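The `filter_rooms` filter above simply trims whitespace around each comma-separated room id. A self-contained copy with a usage example:

```python
def filter_rooms(data):
    # Trim each comma-separated room id; pass None through unchanged.
    if data is not None:
        rooms = data.split(',')
        return ','.join([r.strip() for r in rooms])

normalized = filter_rooms(' 101, 102 ,103')  # '101,102,103'
```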
|
616 | 611 | |
|
617 | 612 | class IntegrationCampfireForm(ReactorForm): |
|
618 | 613 | account = wtforms.StringField( |
|
619 | 614 | 'Account', |
|
620 | 615 | filters=[strip_filter], |
|
621 | 616 | validators=[wtforms.validators.DataRequired()]) |
|
622 | 617 | api_token = wtforms.StringField( |
|
623 | 618 | 'Api Token', |
|
624 | 619 | filters=[strip_filter], |
|
625 | 620 | validators=[wtforms.validators.DataRequired()]) |
|
626 | 621 | rooms = wtforms.StringField('Room ID list', filters=[filter_rooms]) |
|
627 | 622 | |
|
628 | 623 | def validate_api_token(self, field): |
|
629 | 624 | try: |
|
630 | 625 | client = CampfireIntegration.create_client(self.api_token.data, |
|
631 | 626 | self.account.data) |
|
632 | 627 | client.get_account() |
|
633 | 628 | except IntegrationException as e: |
|
634 | 629 | raise wtforms.validators.ValidationError(str(e)) |
|
635 | 630 | |
|
636 | 631 | def validate_rooms(self, field): |
|
637 | 632 | if not field.data: |
|
638 | 633 | return |
|
639 | 634 | client = CampfireIntegration.create_client(self.api_token.data, |
|
640 | 635 | self.account.data) |
|
641 | 636 | |
|
642 | 637 | try: |
|
643 | 638 | room_list = [r['id'] for r in client.get_rooms()] |
|
644 | 639 | except IntegrationException as e: |
|
645 | 640 | raise wtforms.validators.ValidationError(str(e)) |
|
646 | 641 | |
|
647 | 642 | rooms = field.data.split(',') |
|
648 | 643 | if len(rooms) > 3: |
|
649 | 644 | msg = 'You can use up to 3 room ids' |
|
650 | 645 | raise wtforms.validators.ValidationError(msg) |
|
651 | 646 | if rooms: |
|
652 | 647 | for room_id in rooms: |
|
653 | 648 | if int(room_id) not in room_list: |
|
654 | 649 | msg = "Room %s doesn't exist" |
|
655 | 650 | raise wtforms.validators.ValidationError(msg % room_id) |
|
656 | 651 | if not room_id.strip().isdigit(): |
|
657 | 652 | msg = 'You must use only integers for room ids' |
|
658 | 653 | raise wtforms.validators.ValidationError(msg) |
|
659 | 654 | |
|
660 | 655 | submit = wtforms.SubmitField(_('Connect to Campfire')) |
|
661 | 656 | ignore_labels = ['submit'] |
|
662 | 657 | css_classes = {'submit': 'btn btn-primary'} |
|
663 | 658 | |
|
664 | 659 | |
|
|
669 | 664 | |
|
670 | 665 | |
|
671 | 666 | class IntegrationHipchatForm(ReactorForm): |
|
672 | 667 | api_token = wtforms.StringField( |
|
673 | 668 | 'Api Token', |
|
674 | 669 | filters=[strip_filter], |
|
675 | 670 | validators=[wtforms.validators.DataRequired()]) |
|
676 | 671 | rooms = wtforms.StringField( |
|
677 | 672 | 'Room ID list', |
|
678 | 673 | filters=[filter_rooms], |
|
679 | 674 | validators=[wtforms.validators.DataRequired()]) |
|
680 | 675 | |
|
681 | 676 | def validate_rooms(self, field): |
|
682 | 677 | if not field.data: |
|
683 | 678 | return |
|
684 | 679 | client = HipchatIntegration.create_client(self.api_token.data) |
|
685 | 680 | rooms = field.data.split(',') |
|
686 | 681 | if len(rooms) > 3: |
|
687 | 682 | msg = 'You can use up to 3 room ids' |
|
688 | 683 | raise wtforms.validators.ValidationError(msg) |
|
689 | 684 | if rooms: |
|
690 | 685 | for room_id in rooms: |
|
691 | 686 | if not room_id.strip().isdigit(): |
|
692 | 687 | msg = 'You must use only integers for room ids' |
|
693 | 688 | raise wtforms.validators.ValidationError(msg) |
|
694 | 689 | try: |
|
695 | 690 | client.send({ |
|
696 | 691 | "message_format": 'text', |
|
697 | 692 | "message": "testing for room existence", |
|
698 | 693 | "from": "AppEnlight", |
|
699 | 694 | "room_id": room_id, |
|
700 | 695 | "color": "green" |
|
701 | 696 | }) |
|
702 | 697 | except IntegrationException as exc: |
|
703 | 698 | msg = 'Room id: %s exception: %s' |
|
704 | 699 | raise wtforms.validators.ValidationError(msg % (room_id, |
|
705 | 700 | exc)) |
|
706 | 701 | |
|
707 | 702 | |
|
708 | 703 | class IntegrationFlowdockForm(ReactorForm): |
|
709 | 704 | api_token = wtforms.StringField('API Token', |
|
710 | 705 | filters=[strip_filter], |
|
711 | 706 | validators=[ |
|
712 | 707 | wtforms.validators.DataRequired() |
|
713 | 708 | ], ) |
|
714 | 709 | |
|
715 | 710 | def validate_api_token(self, field): |
|
716 | 711 | try: |
|
717 | 712 | client = FlowdockIntegration.create_client(self.api_token.data) |
|
718 | 713 | registry = pyramid.threadlocal.get_current_registry() |
|
719 | 714 | payload = { |
|
720 | 715 | "source": registry.settings['mailing.from_name'], |
|
721 | 716 | "from_address": registry.settings['mailing.from_email'], |
|
722 | 717 | "subject": "Integration test", |
|
723 | 718 | "content": "If you can see this it was successful", |
|
724 | 719 | "tags": ["appenlight"], |
|
725 | 720 | "link": registry.settings['mailing.app_url'] |
|
726 | 721 | } |
|
727 | 722 | client.send_to_inbox(payload) |
|
728 | 723 | except IntegrationException as e: |
|
729 | 724 | raise wtforms.validators.ValidationError(str(e)) |
|
730 | 725 | |
|
731 | 726 | |
|
732 | 727 | class IntegrationSlackForm(ReactorForm): |
|
733 | 728 | webhook_url = wtforms.StringField( |
|
734 | 729 | 'Reports webhook', |
|
735 | 730 | filters=[strip_filter], |
|
736 | 731 | validators=[wtforms.validators.DataRequired()]) |
|
737 | 732 | |
|
738 | 733 | def validate_webhook_url(self, field): |
|
739 | 734 | registry = pyramid.threadlocal.get_current_registry() |
|
740 | 735 | client = SlackIntegration.create_client(field.data) |
|
741 | 736 | link = "<%s|%s>" % (registry.settings['mailing.app_url'], |
|
742 | 737 | registry.settings['mailing.from_name']) |
|
743 | 738 | test_data = { |
|
744 | 739 | "username": "AppEnlight", |
|
745 | 740 | "icon_emoji": ":fire:", |
|
746 | 741 | "attachments": [ |
|
747 | 742 | {"fallback": "Testing integration channel: %s" % link, |
|
748 | 743 | "pretext": "Testing integration channel: %s" % link, |
|
749 | 744 | "color": "good", |
|
750 | 745 | "fields": [ |
|
751 | 746 | { |
|
752 | 747 | "title": "Status", |
|
753 | 748 | "value": "Integration is working fine", |
|
754 | 749 | "short": False |
|
755 | 750 | } |
|
756 | 751 | ]} |
|
757 | 752 | ] |
|
758 | 753 | } |
|
759 | 754 | try: |
|
760 | 755 | client.make_request(data=test_data) |
|
761 | 756 | except IntegrationException as exc: |
|
762 | 757 | raise wtforms.validators.ValidationError(str(exc)) |
|
763 | 758 | |
|
764 | 759 | |
|
765 | 760 | class IntegrationWebhooksForm(ReactorForm): |
|
766 | 761 | reports_webhook = wtforms.StringField( |
|
767 | 762 | 'Reports webhook', |
|
768 | 763 | filters=[strip_filter], |
|
769 | 764 | validators=[wtforms.validators.DataRequired()]) |
|
770 | 765 | alerts_webhook = wtforms.StringField( |
|
771 | 766 | 'Alerts webhook', |
|
772 | 767 | filters=[strip_filter], |
|
773 | 768 | validators=[wtforms.validators.DataRequired()]) |
|
774 | 769 | submit = wtforms.SubmitField(_('Setup webhooks')) |
|
775 | 770 | ignore_labels = ['submit'] |
|
776 | 771 | css_classes = {'submit': 'btn btn-primary'} |
|
777 | 772 | |
|
778 | 773 | |
|
779 | 774 | class IntegrationJiraForm(ReactorForm): |
|
780 | 775 | host_name = wtforms.StringField( |
|
781 | 776 | 'Server URL', |
|
782 | 777 | filters=[strip_filter], |
|
783 | 778 | validators=[wtforms.validators.DataRequired()]) |
|
784 | 779 | user_name = wtforms.StringField( |
|
785 | 780 | 'Username', |
|
786 | 781 | filters=[strip_filter], |
|
787 | 782 | validators=[wtforms.validators.DataRequired()]) |
|
788 | 783 | password = wtforms.PasswordField( |
|
789 | 784 | 'Password', |
|
790 | 785 | filters=[strip_filter], |
|
791 | 786 | validators=[wtforms.validators.DataRequired()]) |
|
792 | 787 | project = wtforms.StringField( |
|
793 | 788 | 'Project key', |
|
794 | 789 | filters=[uppercase_filter, strip_filter], |
|
795 | 790 | validators=[wtforms.validators.DataRequired()]) |
|
796 | 791 | |
|
797 | 792 | def validate_project(self, field): |
|
798 | 793 | if not field.data: |
|
799 | 794 | return |
|
800 | 795 | try: |
|
801 | 796 | client = JiraClient(self.user_name.data, |
|
802 | 797 | self.password.data, |
|
803 | 798 | self.host_name.data, |
|
804 | 799 | self.project.data) |
|
805 | 800 | except Exception as exc: |
|
806 | 801 | raise wtforms.validators.ValidationError(str(exc)) |
|
807 | 802 | |
|
808 | 803 | room_list = [r.key.upper() for r in client.get_projects()] |
|
809 | 804 | if field.data.upper() not in room_list: |
|
810 | 805 | msg = "Project %s doesn't exist in your Jira instance"
|
811 | 806 | raise wtforms.validators.ValidationError(msg % field.data) |
|
812 | 807 | |
|
813 | 808 | |
|
814 | 809 | def get_deletion_form(resource): |
|
815 | 810 | class F(ReactorForm): |
|
816 | 811 | application_name = wtforms.StringField( |
|
817 | 812 | 'Application Name', |
|
818 | 813 | filters=[strip_filter], |
|
819 | 814 | validators=[wtforms.validators.AnyOf([resource.resource_name])]) |
|
820 | 815 | resource_id = wtforms.HiddenField(default=resource.resource_id) |
|
821 | 816 | submit = wtforms.SubmitField(_('Delete my application')) |
|
822 | 817 | ignore_labels = ['submit'] |
|
823 | 818 | css_classes = {'submit': 'btn btn-danger'} |
|
824 | 819 | |
|
825 | 820 | return F |
|
826 | 821 | |
|
827 | 822 | |
|
828 | 823 | class ChangeApplicationOwnerForm(ReactorForm): |
|
829 | 824 | password = wtforms.PasswordField( |
|
830 | 825 | 'Password', |
|
831 | 826 | filters=[strip_filter], |
|
832 | 827 | validators=[old_password_validator, |
|
833 | 828 | wtforms.validators.DataRequired()]) |
|
834 | 829 | |
|
835 | 830 | user_name = wtforms.StringField( |
|
836 | 831 | "New owner's username",
|
837 | 832 | filters=[strip_filter], |
|
838 | 833 | validators=[found_username_validator, |
|
839 | 834 | wtforms.validators.DataRequired()]) |
|
840 | 835 | submit = wtforms.SubmitField(_('Transfer ownership of application')) |
|
841 | 836 | ignore_labels = ['submit'] |
|
842 | 837 | css_classes = {'submit': 'btn btn-danger'} |
|
843 | 838 | |
|
844 | 839 | |
|
845 | 840 | def default_filename(): |
|
846 | 841 | return 'Invoice %s' % datetime.datetime.utcnow().strftime('%Y/%m') |
|
847 | 842 | |
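`default_filename` builds the default invoice title from the current UTC year and month; a self-contained copy showing the resulting format:

```python
import datetime


def default_filename():
    # Default invoice title: 'Invoice YYYY/MM' for the current UTC month.
    return 'Invoice %s' % datetime.datetime.utcnow().strftime('%Y/%m')


title = default_filename()  # e.g. 'Invoice 2017/03'
```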
|
848 | 843 | |
|
849 | 844 | class FileUploadForm(ReactorForm): |
|
850 | 845 | title = wtforms.StringField('File Title', |
|
851 | 846 | default=default_filename, |
|
852 | 847 | validators=[wtforms.validators.DataRequired()]) |
|
853 | 848 | file = wtforms.FileField('File') |
|
854 | 849 | |
|
855 | 850 | def validate_file(self, field): |
|
856 | 851 | if not hasattr(field.data, 'file'): |
|
857 | 852 | raise wtforms.ValidationError('File is missing') |
|
858 | 853 | |
|
859 | 854 | submit = wtforms.SubmitField(_('Upload')) |
|
860 | 855 | |
|
861 | 856 | |
|
862 | 857 | def get_partition_deletion_form(es_indices, pg_indices): |
|
863 | 858 | class F(ReactorForm): |
|
864 | 859 | es_index = wtforms.SelectMultipleField('Elasticsearch', |
|
865 | 860 | choices=[(ix, '') for ix in |
|
866 | 861 | es_indices]) |
|
867 | 862 | pg_index = wtforms.SelectMultipleField('pg', |
|
868 | 863 | choices=[(ix, '') for ix in |
|
869 | 864 | pg_indices]) |
|
870 | 865 | confirm = wtforms.StringField('Confirm',
|
871 | 866 | filters=[uppercase_filter, strip_filter], |
|
872 | 867 | validators=[ |
|
873 | 868 | wtforms.validators.AnyOf(['CONFIRM']), |
|
874 | 869 | wtforms.validators.DataRequired()]) |
|
875 | 870 | ignore_labels = ['submit'] |
|
876 | 871 | css_classes = {'submit': 'btn btn-danger'} |
|
877 | 872 | |
|
878 | 873 | return F |
|
879 | 874 | |
|
880 | 875 | |
|
881 | 876 | class GroupCreateForm(ReactorForm): |
|
882 | 877 | group_name = wtforms.StringField( |
|
883 | 878 | _('Group Name'), |
|
884 | 879 | filters=[strip_filter], |
|
885 | 880 | validators=[ |
|
886 | 881 | wtforms.validators.Length(min=2, max=50), |
|
887 | 882 | unique_groupname_validator, |
|
888 | 883 | wtforms.validators.DataRequired() |
|
889 | 884 | ]) |
|
890 | 885 | description = wtforms.StringField(_('Group description')) |
|
891 | 886 | |
|
892 | 887 | |
|
893 | 888 | time_choices = [(k, v['label'],) for k, v in h.time_deltas.items()] |
|
894 | 889 | |
|
895 | 890 | |
|
896 | 891 | class AuthTokenCreateForm(ReactorForm): |
|
897 | 892 | description = wtforms.StringField(_('Token description')) |
|
898 | 893 | expires = wtforms.SelectField('Expires', |
|
899 | 894 | coerce=lambda x: x, |
|
900 | 895 | choices=time_choices, |
|
901 | 896 | validators=[wtforms.validators.Optional()]) |
@@ -1,55 +1,50 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | """Miscellaneous support packages for {{project}}. |
|
23 | 18 | """ |
|
24 | 19 | import random |
|
25 | 20 | import string |
|
26 | 21 | import importlib |
|
27 | 22 | |
|
28 | 23 | from appenlight_client.exceptions import get_current_traceback |
|
29 | 24 | |
|
30 | 25 | |
|
31 | 26 | def generate_random_string(chars=10): |
|
32 | 27 | return ''.join(random.sample(string.ascii_letters * 2 + string.digits, |
|
33 | 28 | chars)) |
|
34 | 29 | |
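`generate_random_string` samples without replacement, so the population must be at least as long as the requested string; doubling `ascii_letters` both satisfies that and lets the same letter appear twice in the output. A self-contained copy:

```python
import random
import string


def generate_random_string(chars=10):
    # random.sample picks without replacement, so the doubled alphabet
    # keeps the population large enough and allows repeated letters.
    return ''.join(random.sample(string.ascii_letters * 2 + string.digits,
                                 chars))


token = generate_random_string()  # e.g. a 10-character alphanumeric id
```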
|
35 | 30 | |
|
36 | 31 | def to_integer_safe(input): |
|
37 | 32 | try: |
|
38 | 33 | return int(input) |
|
39 | 34 | except (TypeError, ValueError,): |
|
40 | 35 | return None |
|
41 | 36 | |
|
42 | 37 | |
|
43 | 38 | def print_traceback(log): |
|
44 | 39 | traceback = get_current_traceback(skip=1, show_hidden_frames=True, |
|
45 | 40 | ignore_system_exceptions=True) |
|
46 | 41 | exception_text = traceback.exception |
|
47 | 42 | log.error(exception_text) |
|
48 | 43 | log.error(traceback.plaintext) |
|
49 | 44 | del traceback |
|
50 | 45 | |
|
51 | 46 | |
|
52 | 47 | def get_callable(import_string): |
|
53 | 48 | import_module, indexer_callable = import_string.split(':') |
|
54 | 49 | return getattr(importlib.import_module(import_module), |
|
55 | 50 | indexer_callable) |
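`get_callable` resolves `'package.module:attribute'` strings into live objects; a self-contained copy with a stdlib usage example:

```python
import importlib


def get_callable(import_string):
    # Split 'package.module:attr' and resolve the attribute dynamically.
    import_module, indexer_callable = import_string.split(':')
    return getattr(importlib.import_module(import_module),
                   indexer_callable)


dumps = get_callable('json:dumps')  # same object as json.dumps
```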
@@ -1,86 +1,81 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import datetime |
|
23 | 18 | import logging |
|
24 | 19 | |
|
25 | 20 | from pyramid.httpexceptions import HTTPForbidden, HTTPTooManyRequests |
|
26 | 21 | |
|
27 | 22 | from appenlight.models import Datastores |
|
28 | 23 | from appenlight.models.services.config import ConfigService |
|
29 | 24 | from appenlight.lib.redis_keys import REDIS_KEYS |
|
30 | 25 | |
|
31 | 26 | log = logging.getLogger(__name__) |
|
32 | 27 | |
|
33 | 28 | |
|
34 | 29 | def rate_limiting(request, resource, section, to_increment=1): |
|
35 | 30 | tsample = datetime.datetime.utcnow().replace(second=0, microsecond=0) |
|
36 | 31 | key = REDIS_KEYS['rate_limits'][section].format(tsample, |
|
37 | 32 | resource.resource_id) |
|
38 | 33 | redis_pipeline = request.registry.redis_conn.pipeline() |
|
39 | 34 | redis_pipeline.incr(key, to_increment) |
|
40 | 35 | redis_pipeline.expire(key, 3600 * 24) |
|
41 | 36 | results = redis_pipeline.execute() |
|
42 | 37 | current_count = results[0] |
|
43 | 38 | config = ConfigService.by_key_and_section(section, 'global') |
|
44 | 39 | limit = config.value if config else 1000 |
|
45 | 40 | if current_count > int(limit): |
|
46 | 41 | log.info('RATE LIMITING: {}: {}, {}'.format( |
|
47 | 42 | section, resource, current_count)) |
|
48 | 43 | abort_msg = 'Rate limits are in effect for this application' |
|
49 | 44 | raise HTTPTooManyRequests(abort_msg, |
|
50 | 45 | headers={'X-AppEnlight': abort_msg}) |
|
51 | 46 | |
|
52 | 47 | |
|
53 | 48 | def check_cors(request, application, should_return=True): |
|
54 | 49 | """ |
|
55 | 50 | Performs a check and validation if request comes from authorized domain for |
|
56 | 51 | application, otherwise return 403 |
|
57 | 52 | """ |
|
58 | 53 | origin_found = False |
|
59 | 54 | origin = request.headers.get('Origin') |
|
60 | 55 | if should_return: |
|
61 | 56 | log.info('CORS for %s' % origin) |
|
62 | 57 | if not origin: |
|
63 | 58 | return False |
|
64 | 59 | for domain in application.domains.split('\n'): |
|
65 | 60 | if domain in origin: |
|
66 | 61 | origin_found = True |
|
67 | 62 | if origin_found: |
|
68 | 63 | request.response.headers.add('Access-Control-Allow-Origin', origin) |
|
69 | 64 | request.response.headers.add('XDomainRequestAllowed', '1') |
|
70 | 65 | request.response.headers.add('Access-Control-Allow-Methods', |
|
71 | 66 | 'GET, POST, OPTIONS') |
|
72 | 67 | request.response.headers.add('Access-Control-Allow-Headers', |
|
73 | 68 | 'Accept-Encoding, Accept-Language, ' |
|
74 | 69 | 'Content-Type, ' |
|
75 | 70 | 'Depth, User-Agent, X-File-Size, ' |
|
76 | 71 | 'X-Requested-With, If-Modified-Since, ' |
|
77 | 72 | 'X-File-Name, ' |
|
78 | 73 | 'Cache-Control, Host, Pragma, Accept, ' |
|
79 | 74 | 'Origin, Connection, ' |
|
80 | 75 | 'Referer, Cookie, ' |
|
81 | 76 | 'X-appenlight-public-api-key, ' |
|
82 | 77 | 'x-appenlight-public-api-key') |
|
83 | 78 | request.response.headers.add('Access-Control-Max-Age', '86400') |
|
84 | 79 | return request.response |
|
85 | 80 | else: |
|
86 | 81 | return HTTPForbidden() |
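The `rate_limiting` helper above is a fixed-window limiter: it truncates the timestamp to the minute, `INCR`s a per-minute Redis key inside a pipeline, gives it a 24-hour TTL, and aborts with HTTP 429 once the counter passes the configured limit. The sketch below keeps the window-key idea and the 1000-request default from the hunk, but stands in a plain dict for Redis and a `RuntimeError` for `HTTPTooManyRequests`, so it runs anywhere; the key layout is illustrative:

```python
import datetime

counters = {}  # stand-in for Redis: window key -> current count


def rate_limiting(resource_id, section, limit=1000, to_increment=1, now=None):
    # Truncating to the minute makes every request in the same minute
    # land on the same counter key (a fixed-window rate limiter).
    now = now or datetime.datetime.utcnow()
    tsample = now.replace(second=0, microsecond=0)
    key = 'rate_limits:{}:{}:{}'.format(
        section, tsample.isoformat(), resource_id)
    counters[key] = counters.get(key, 0) + to_increment
    if counters[key] > limit:
        raise RuntimeError('Rate limits are in effect for this application')
    return counters[key]
```

In the real code the pipeline also sets a 24-hour expiry on each key so stale minute counters are evicted on the Redis side rather than in application code.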
@@ -1,188 +1,183 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import copy |
|
23 | 18 | import hashlib |
|
24 | 19 | import inspect |
|
25 | 20 | |
|
26 | 21 | from dogpile.cache import make_region, compat |
|
27 | 22 | |
|
28 | 23 | regions = None |
|
29 | 24 | |
|
30 | 25 | |
|
31 | 26 | def key_mangler(key): |
|
32 | 27 | return "appenlight:dogpile:{}".format(key) |
|
33 | 28 | |
|
34 | 29 | |
|
35 | 30 | def hashgen(namespace, fn, to_str=compat.string_type): |
|
36 | 31 | """Return a function that generates a string |
|
37 | 32 | key, based on a given function as well as |
|
38 | 33 | arguments to the returned function itself. |
|
39 | 34 | |
|
40 | 35 | This is used by :meth:`.CacheRegion.cache_on_arguments` |
|
41 | 36 | to generate a cache key from a decorated function. |
|
42 | 37 | |
|
43 | 38 | It can be replaced using the ``function_key_generator`` |
|
44 | 39 | argument passed to :func:`.make_region`. |
|
45 | 40 | |
|
46 | 41 | """ |
|
47 | 42 | |
|
48 | 43 | if namespace is None: |
|
49 | 44 | namespace = '%s:%s' % (fn.__module__, fn.__name__) |
|
50 | 45 | else: |
|
51 | 46 | namespace = '%s:%s|%s' % (fn.__module__, fn.__name__, namespace) |
|
52 | 47 | |
|
53 | 48 | args = inspect.getargspec(fn) |
|
54 | 49 | has_self = args[0] and args[0][0] in ('self', 'cls') |
|
55 | 50 | |
|
56 | 51 | def generate_key(*args, **kw): |
|
57 | 52 | if kw: |
|
58 | 53 | raise ValueError( |
|
59 | 54 | "dogpile.cache's default key creation " |
|
60 | 55 | "function does not accept keyword arguments.") |
|
61 | 56 | if has_self: |
|
62 | 57 | args = args[1:] |
|
63 | 58 | |
|
64 | 59 | return namespace + "|" + hashlib.sha1( |
|
65 | 60 | " ".join(map(to_str, args)).encode('utf8')).hexdigest() |
|
66 | 61 | |
|
67 | 62 | return generate_key |
|
68 | 63 | |
|
69 | 64 | |
|
70 | 65 | class CacheRegions(object): |
|
71 | 66 | def __init__(self, settings): |
|
72 | 67 | config_redis = {"arguments": settings} |
|
73 | 68 | |
|
74 | 69 | self.redis_min_1 = make_region( |
|
75 | 70 | function_key_generator=hashgen, |
|
76 | 71 | key_mangler=key_mangler).configure( |
|
77 | 72 | "dogpile.cache.redis", |
|
78 | 73 | expiration_time=60, |
|
79 | 74 | **copy.deepcopy(config_redis)) |
|
80 | 75 | self.redis_min_5 = make_region( |
|
81 | 76 | function_key_generator=hashgen, |
|
82 | 77 | key_mangler=key_mangler).configure( |
|
83 | 78 | "dogpile.cache.redis", |
|
84 | 79 | expiration_time=300, |
|
85 | 80 | **copy.deepcopy(config_redis)) |
|
86 | 81 | |
|
87 | 82 | self.redis_min_10 = make_region( |
|
88 | 83 | function_key_generator=hashgen, |
|
89 | 84 | key_mangler=key_mangler).configure( |
|
90 | 85 | "dogpile.cache.redis", |
|
91 | 86 | expiration_time=60, |
|
92 | 87 | **copy.deepcopy(config_redis)) |
|
93 | 88 | |
|
94 | 89 | self.redis_min_60 = make_region( |
|
95 | 90 | function_key_generator=hashgen, |
|
96 | 91 | key_mangler=key_mangler).configure( |
|
97 | 92 | "dogpile.cache.redis", |
|
98 | 93 | expiration_time=3600, |
|
99 | 94 | **copy.deepcopy(config_redis)) |
|
100 | 95 | |
|
101 | 96 | self.redis_sec_1 = make_region( |
|
102 | 97 | function_key_generator=hashgen, |
|
103 | 98 | key_mangler=key_mangler).configure( |
|
104 | 99 | "dogpile.cache.redis", |
|
105 | 100 | expiration_time=1, |
|
106 | 101 | **copy.deepcopy(config_redis)) |
|
107 | 102 | |
|
108 | 103 | self.redis_sec_5 = make_region( |
|
109 | 104 | function_key_generator=hashgen, |
|
110 | 105 | key_mangler=key_mangler).configure( |
|
111 | 106 | "dogpile.cache.redis", |
|
112 | 107 | expiration_time=5, |
|
113 | 108 | **copy.deepcopy(config_redis)) |
|
114 | 109 | |
|
115 | 110 | self.redis_sec_30 = make_region( |
|
116 | 111 | function_key_generator=hashgen, |
|
117 | 112 | key_mangler=key_mangler).configure( |
|
118 | 113 | "dogpile.cache.redis", |
|
119 | 114 | expiration_time=30, |
|
120 | 115 | **copy.deepcopy(config_redis)) |
|
121 | 116 | |
|
122 | 117 | self.redis_day_1 = make_region( |
|
123 | 118 | function_key_generator=hashgen, |
|
124 | 119 | key_mangler=key_mangler).configure( |
|
125 | 120 | "dogpile.cache.redis", |
|
126 | 121 | expiration_time=86400, |
|
127 | 122 | **copy.deepcopy(config_redis)) |
|
128 | 123 | |
|
129 | 124 | self.redis_day_7 = make_region( |
|
130 | 125 | function_key_generator=hashgen, |
|
131 | 126 | key_mangler=key_mangler).configure( |
|
132 | 127 | "dogpile.cache.redis", |
|
133 | 128 | expiration_time=86400 * 7, |
|
134 | 129 | **copy.deepcopy(config_redis)) |
|
135 | 130 | |
|
136 | 131 | self.redis_day_30 = make_region( |
|
137 | 132 | function_key_generator=hashgen, |
|
138 | 133 | key_mangler=key_mangler).configure( |
|
139 | 134 | "dogpile.cache.redis", |
|
140 | 135 | expiration_time=86400 * 30, |
|
141 | 136 | **copy.deepcopy(config_redis)) |
|
142 | 137 | |
|
143 | 138 | self.memory_day_1 = make_region( |
|
144 | 139 | function_key_generator=hashgen, |
|
145 | 140 | key_mangler=key_mangler).configure( |
|
146 | 141 | "dogpile.cache.memory", |
|
147 | 142 | expiration_time=86400, |
|
148 | 143 | **copy.deepcopy(config_redis)) |
|
149 | 144 | |
|
150 | 145 | self.memory_sec_1 = make_region( |
|
151 | 146 | function_key_generator=hashgen, |
|
152 | 147 | key_mangler=key_mangler).configure( |
|
153 | 148 | "dogpile.cache.memory", |
|
154 | 149 | expiration_time=1) |
|
155 | 150 | |
|
156 | 151 | self.memory_sec_5 = make_region( |
|
157 | 152 | function_key_generator=hashgen, |
|
158 | 153 | key_mangler=key_mangler).configure( |
|
159 | 154 | "dogpile.cache.memory", |
|
160 | 155 | expiration_time=5) |
|
161 | 156 | |
|
162 | 157 | self.memory_min_1 = make_region( |
|
163 | 158 | function_key_generator=hashgen, |
|
164 | 159 | key_mangler=key_mangler).configure( |
|
165 | 160 | "dogpile.cache.memory", |
|
166 | 161 | expiration_time=60) |
|
167 | 162 | |
|
168 | 163 | self.memory_min_5 = make_region( |
|
169 | 164 | function_key_generator=hashgen, |
|
170 | 165 | key_mangler=key_mangler).configure( |
|
171 | 166 | "dogpile.cache.memory", |
|
172 | 167 | expiration_time=300) |
|
173 | 168 | |
|
174 | 169 | self.memory_min_10 = make_region( |
|
175 | 170 | function_key_generator=hashgen, |
|
176 | 171 | key_mangler=key_mangler).configure( |
|
177 | 172 | "dogpile.cache.memory", |
|
178 | 173 | expiration_time=600) |
|
179 | 174 | |
|
180 | 175 | self.memory_min_60 = make_region( |
|
181 | 176 | function_key_generator=hashgen, |
|
182 | 177 | key_mangler=key_mangler).configure( |
|
183 | 178 | "dogpile.cache.memory", |
|
184 | 179 | expiration_time=3600) |
|
185 | 180 | |
|
186 | 181 | |
|
187 | 182 | def get_region(region): |
|
188 | 183 | return getattr(regions, region) |
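The `hashgen` generator above builds dogpile cache keys as `module:function|sha1-of-stringified-args`, and every region then runs the key through `key_mangler` to prefix it with `appenlight:dogpile:`. A self-contained sketch of that key pipeline, with no dogpile dependency and the function-introspection part simplified to an explicit namespace argument:

```python
import hashlib


def key_mangler(key):
    # Same prefix the regions above apply to every cache key.
    return "appenlight:dogpile:{}".format(key)


def make_key(namespace, *args):
    # The namespace identifies the cached function; the SHA1 digest of
    # the space-joined, stringified positional args makes the key
    # unique per call signature.
    digest = hashlib.sha1(
        ' '.join(str(a) for a in args).encode('utf8')).hexdigest()
    return key_mangler(namespace + '|' + digest)


key = make_key('reports.models:get_report', 1234, 'error')
```

Hashing the arguments keeps keys a fixed, Redis-safe length regardless of how large the raw argument values are.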
@@ -1,63 +1,58 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | # this gets set on runtime |
|
23 | 18 | from cryptography.fernet import Fernet |
|
24 | 19 | |
|
25 | 20 | ENCRYPTION_SECRET = None |
|
26 | 21 | |
|
27 | 22 | |
|
28 | 23 | def encrypt_fernet(value): |
|
29 | 24 | # avoid double encryption |
|
30 | 25 | # not sure if this is needed but it won't hurt too much to have this |
|
31 | 26 | if value.startswith('enc$fernet$'): |
|
32 | 27 | return value |
|
33 | 28 | f = Fernet(ENCRYPTION_SECRET) |
|
34 | 29 | return 'enc$fernet${}'.format(f.encrypt(value.encode('utf8')).decode('utf8')) |
|
35 | 30 | |
|
36 | 31 | |
|
37 | 32 | def decrypt_fernet(value): |
|
38 | 33 | parts = value.split('$', 3) |
|
39 | 34 | if not len(parts) == 3: |
|
40 | 35 | # not encrypted values |
|
41 | 36 | return value |
|
42 | 37 | else: |
|
43 | 38 | f = Fernet(ENCRYPTION_SECRET) |
|
44 | 39 | decrypted_data = f.decrypt(parts[2].encode('utf8')).decode('utf8') |
|
45 | 40 | return decrypted_data |
|
46 | 41 | |
|
47 | 42 | |
|
48 | 43 | def encrypt_dictionary_keys(_dict, exclude_keys=None): |
|
49 | 44 | if not exclude_keys: |
|
50 | 45 | exclude_keys = [] |
|
51 | 46 | keys = [k for k in _dict.keys() if k not in exclude_keys] |
|
52 | 47 | for k in keys: |
|
53 | 48 | _dict[k] = encrypt_fernet(_dict[k]) |
|
54 | 49 | return _dict |
|
55 | 50 | |
|
56 | 51 | |
|
57 | 52 | def decrypt_dictionary_keys(_dict, exclude_keys=None): |
|
58 | 53 | if not exclude_keys: |
|
59 | 54 | exclude_keys = [] |
|
60 | 55 | keys = [k for k in _dict.keys() if k not in exclude_keys] |
|
61 | 56 | for k in keys: |
|
62 | 57 | _dict[k] = decrypt_fernet(_dict[k]) |
|
63 | 58 | return _dict |
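The helpers above tag every encrypted value with an `enc$fernet$` prefix so already-encrypted values can be detected (and skipped) and plain values pass through decryption untouched. The sketch below keeps that exact prefix scheme and the `split('$', 3)` parsing from the hunk, but swaps `cryptography.fernet.Fernet` for a trivial base64 stand-in so the round trip runs without the `cryptography` package; the stand-in is NOT encryption:

```python
import base64


def _toy_encrypt(value):
    # Stand-in for Fernet: base64 only makes the round trip observable.
    return base64.b64encode(value.encode('utf8')).decode('utf8')


def _toy_decrypt(token):
    return base64.b64decode(token.encode('utf8')).decode('utf8')


def encrypt_value(value):
    if value.startswith('enc$fernet$'):
        return value  # avoid double encryption
    return 'enc$fernet${}'.format(_toy_encrypt(value))


def decrypt_value(value):
    parts = value.split('$', 3)
    if len(parts) != 3:
        return value  # not an encrypted value, pass it through
    return _toy_decrypt(parts[2])
```

Embedding the cipher name in the prefix also leaves room to rotate to a different scheme later and still recognize old values.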
@@ -1,93 +1,88 b'' | |||
|
1 | 1 | import collections |
|
2 | 2 | # -*- coding: utf-8 -*- |
|
3 | 3 | |
|
4 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
4 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
5 | 5 | # |
|
6 | # This program is free software: you can redistribute it and/or modify | |
|
7 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
8 | # (only), as published by the Free Software Foundation. | |
|
6 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
7 | # you may not use this file except in compliance with the License. | |
|
8 | # You may obtain a copy of the License at | |
|
9 | 9 | # |
|
10 | # This program is distributed in the hope that it will be useful, | |
|
11 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
12 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
13 | # GNU General Public License for more details. | |
|
10 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
14 | 11 | # |
|
15 | # You should have received a copy of the GNU Affero General Public License | |
|
16 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
17 | # | |
|
18 | # This program is dual-licensed. If you wish to learn more about the | |
|
19 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
20 | # services, and proprietary license terms, please see | |
|
21 | # https://rhodecode.com/licenses/ | |
|
12 | # Unless required by applicable law or agreed to in writing, software | |
|
13 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
14 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
15 | # See the License for the specific language governing permissions and | |
|
16 | # limitations under the License. | |
|
22 | 17 | |
|
23 | 18 | |
|
24 | 19 | class StupidEnum(object): |
|
25 | 20 | @classmethod |
|
26 | 21 | def set_inverse(cls): |
|
27 | 22 | cls._inverse_values = dict( |
|
28 | 23 | (y, x) for x, y in vars(cls).items() if |
|
29 | 24 | not x.startswith('_') and not callable(y) |
|
30 | 25 | ) |
|
31 | 26 | |
|
32 | 27 | @classmethod |
|
33 | 28 | def key_from_value(cls, value): |
|
34 | 29 | if not hasattr(cls, '_inverse_values'): |
|
35 | 30 | cls.set_inverse() |
|
36 | 31 | return cls._inverse_values.get(value) |
|
37 | 32 | |
|
38 | 33 | |
|
39 | 34 | class ReportType(StupidEnum): |
|
40 | 35 | unknown = 0 |
|
41 | 36 | error = 1 |
|
42 | 37 | not_found = 2 |
|
43 | 38 | slow = 3 |
|
44 | 39 | |
|
45 | 40 | |
|
46 | 41 | class Language(StupidEnum): |
|
47 | 42 | unknown = 0 |
|
48 | 43 | python = 1 |
|
49 | 44 | javascript = 2 |
|
50 | 45 | java = 3 |
|
51 | 46 | objectivec = 4 |
|
52 | 47 | swift = 5 |
|
53 | 48 | cpp = 6 |
|
54 | 49 | basic = 7 |
|
55 | 50 | csharp = 8 |
|
56 | 51 | php = 9 |
|
57 | 52 | perl = 10 |
|
58 | 53 | vb = 11 |
|
59 | 54 | vbnet = 12 |
|
60 | 55 | ruby = 13 |
|
61 | 56 | fsharp = 14 |
|
62 | 57 | actionscript = 15 |
|
63 | 58 | go = 16 |
|
64 | 59 | scala = 17 |
|
65 | 60 | haskell = 18 |
|
66 | 61 | erlang = 19 |
|
67 | 62 | haxe = 20 |
|
68 | 63 | scheme = 21 |
|
69 | 64 | |
|
70 | 65 | |
|
71 | 66 | class LogLevel(StupidEnum): |
|
72 | 67 | UNKNOWN = 0 |
|
73 | 68 | DEBUG = 2 |
|
74 | 69 | TRACE = 4 |
|
75 | 70 | INFO = 6 |
|
76 | 71 | WARNING = 8 |
|
77 | 72 | ERROR = 10 |
|
78 | 73 | CRITICAL = 12 |
|
79 | 74 | FATAL = 14 |
|
80 | 75 | |
|
81 | 76 | |
|
82 | 77 | class LogLevelPython(StupidEnum): |
|
83 | 78 | CRITICAL = 50 |
|
84 | 79 | ERROR = 40 |
|
85 | 80 | WARNING = 30 |
|
86 | 81 | INFO = 20 |
|
87 | 82 | DEBUG = 10 |
|
88 | 83 | NOTSET = 0 |
|
89 | 84 | |
|
90 | 85 | |
|
91 | 86 | class ParsedSentryEventType(StupidEnum): |
|
92 | 87 | ERROR_REPORT = 1 |
|
93 | 88 | LOG = 2 |
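`StupidEnum` above lazily builds a reverse value-to-name map from a subclass's plain class attributes, so e.g. `ReportType.key_from_value(3)` yields `'slow'`. A runnable sketch of the same trick with one of the enums from the hunk:

```python
class StupidEnum(object):
    @classmethod
    def set_inverse(cls):
        # Build a value -> attribute-name map from the class dict,
        # skipping private/dunder names and anything callable.
        cls._inverse_values = dict(
            (y, x) for x, y in vars(cls).items()
            if not x.startswith('_') and not callable(y))

    @classmethod
    def key_from_value(cls, value):
        # Lazily populate the inverse map on first lookup.
        if not hasattr(cls, '_inverse_values'):
            cls.set_inverse()
        return cls._inverse_values.get(value)


class ReportType(StupidEnum):
    unknown = 0
    error = 1
    not_found = 2
    slow = 3
```

Because `vars(cls)` only sees the subclass's own dict, each enum subclass gets its own `_inverse_values` map the first time it is queried.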
@@ -1,153 +1,148 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | """ |
|
23 | 18 | ex-json borrowed from Marcin Kuzminski |
|
24 | 19 | |
|
25 | 20 | source: https://secure.rhodecode.org/ext-json |
|
26 | 21 | |
|
27 | 22 | """ |
|
28 | 23 | import datetime |
|
29 | 24 | import functools |
|
30 | 25 | import decimal |
|
31 | 26 | import imp |
|
32 | 27 | |
|
33 | 28 | __all__ = ['json', 'simplejson', 'stdlibjson'] |
|
34 | 29 | |
|
35 | 30 | |
|
36 | 31 | def _is_aware(value): |
|
37 | 32 | """ |
|
38 | 33 | Determines if a given datetime.time is aware. |
|
39 | 34 | |
|
40 | 35 | The logic is described in Python's docs: |
|
41 | 36 | http://docs.python.org/library/datetime.html#datetime.tzinfo |
|
42 | 37 | """ |
|
43 | 38 | return (value.tzinfo is not None |
|
44 | 39 | and value.tzinfo.utcoffset(value) is not None) |
|
45 | 40 | |
|
46 | 41 | |
|
47 | 42 | def _obj_dump(obj): |
|
48 | 43 | """ |
|
49 | 44 | Custom function for dumping objects to JSON, if obj has __json__ attribute |
|
50 | 45 | or method defined it will be used for serialization |
|
51 | 46 | |
|
52 | 47 | :param obj: |
|
53 | 48 | """ |
|
54 | 49 | |
|
55 | 50 | if isinstance(obj, complex): |
|
56 | 51 | return [obj.real, obj.imag] |
|
57 | 52 | # See "Date Time String Format" in the ECMA-262 specification. |
|
58 | 53 | # some code borrowed from django 1.4 |
|
59 | 54 | elif isinstance(obj, datetime.datetime): |
|
60 | 55 | r = obj.isoformat() |
|
61 | 56 | # if obj.microsecond: |
|
62 | 57 | # r = r[:23] + r[26:] |
|
63 | 58 | if r.endswith('+00:00'): |
|
64 | 59 | r = r[:-6] + 'Z' |
|
65 | 60 | return r |
|
66 | 61 | elif isinstance(obj, datetime.date): |
|
67 | 62 | return obj.isoformat() |
|
68 | 63 | elif isinstance(obj, decimal.Decimal): |
|
69 | 64 | return str(obj) |
|
70 | 65 | elif isinstance(obj, datetime.time): |
|
71 | 66 | if _is_aware(obj): |
|
72 | 67 | raise ValueError("JSON can't represent timezone-aware times.") |
|
73 | 68 | r = obj.isoformat() |
|
74 | 69 | if obj.microsecond: |
|
75 | 70 | r = r[:12] |
|
76 | 71 | return r |
|
77 | 72 | elif isinstance(obj, set): |
|
78 | 73 | return list(obj) |
|
79 | 74 | elif hasattr(obj, '__json__'): |
|
80 | 75 | if callable(obj.__json__): |
|
81 | 76 | return obj.__json__() |
|
82 | 77 | else: |
|
83 | 78 | return obj.__json__ |
|
84 | 79 | else: |
|
85 | 80 | raise NotImplementedError |
|
86 | 81 | |
|
87 | 82 | |
|
88 | 83 | # Import simplejson |
|
89 | 84 | try: |
|
90 | 85 | # import simplejson initially |
|
91 | 86 | _sj = imp.load_module('_sj', *imp.find_module('simplejson')) |
|
92 | 87 | |
|
93 | 88 | |
|
94 | 89 | def extended_encode(obj): |
|
95 | 90 | try: |
|
96 | 91 | return _obj_dump(obj) |
|
97 | 92 | except NotImplementedError: |
|
98 | 93 | pass |
|
99 | 94 | raise TypeError("%r is not JSON serializable" % (obj,)) |
|
100 | 95 | |
|
101 | 96 | |
|
102 | 97 | # we handle decimals our own it makes unified behavior of json vs |
|
103 | 98 | # simplejson |
|
104 | 99 | sj_version = [int(x) for x in _sj.__version__.split('.')] |
|
105 | 100 | major, minor = sj_version[0], sj_version[1] |
|
106 | 101 | if major < 2 or (major == 2 and minor < 1): |
|
107 | 102 | # simplejson < 2.1 doesnt support use_decimal |
|
108 | 103 | _sj.dumps = functools.partial( |
|
109 | 104 | _sj.dumps, default=extended_encode) |
|
110 | 105 | _sj.dump = functools.partial( |
|
111 | 106 | _sj.dump, default=extended_encode) |
|
112 | 107 | else: |
|
113 | 108 | _sj.dumps = functools.partial( |
|
114 | 109 | _sj.dumps, default=extended_encode, use_decimal=False) |
|
115 | 110 | _sj.dump = functools.partial( |
|
116 | 111 | _sj.dump, default=extended_encode, use_decimal=False) |
|
117 | 112 | simplejson = _sj |
|
118 | 113 | |
|
119 | 114 | except ImportError: |
|
120 | 115 | # no simplejson set it to None |
|
121 | 116 | simplejson = None |
|
122 | 117 | |
|
123 | 118 | try: |
|
124 | 119 | # simplejson not found try out regular json module |
|
125 | 120 | _json = imp.load_module('_json', *imp.find_module('json')) |
|
126 | 121 | |
|
127 | 122 | |
|
128 | 123 | # extended JSON encoder for json |
|
129 | 124 | class ExtendedEncoder(_json.JSONEncoder): |
|
130 | 125 | def default(self, obj): |
|
131 | 126 | try: |
|
132 | 127 | return _obj_dump(obj) |
|
133 | 128 | except NotImplementedError: |
|
134 | 129 | pass |
|
135 | 130 | raise TypeError("%r is not JSON serializable" % (obj,)) |
|
136 | 131 | |
|
137 | 132 | |
|
138 | 133 | # monkey-patch JSON encoder to use extended version |
|
139 | 134 | _json.dumps = functools.partial(_json.dumps, cls=ExtendedEncoder) |
|
140 | 135 | _json.dump = functools.partial(_json.dump, cls=ExtendedEncoder) |
|
141 | 136 | |
|
142 | 137 | except ImportError: |
|
143 | 138 | json = None |
|
144 | 139 | |
|
145 | 140 | stdlibjson = _json |
|
146 | 141 | |
|
147 | 142 | # set all available json modules |
|
148 | 143 | if simplejson: |
|
149 | 144 | json = _sj |
|
150 | 145 | elif _json: |
|
151 | 146 | json = _json |
|
152 | 147 | else: |
|
153 | 148 | raise ImportError('Could not find any json modules') |
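`ext_json` above wraps whichever JSON backend is importable with an encoder that understands datetimes, `Decimal`, sets, and objects exposing `__json__`. A condensed sketch of the same `default`-hook approach on top of the stdlib `json` module only:

```python
import datetime
import decimal
import json


class ExtendedEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, (datetime.datetime, datetime.date)):
            return obj.isoformat()   # ECMA-262 style date strings
        if isinstance(obj, decimal.Decimal):
            return str(obj)          # keep full precision as text
        if isinstance(obj, set):
            return list(obj)         # JSON has no set type
        if hasattr(obj, '__json__'):
            return obj.__json__() if callable(obj.__json__) else obj.__json__
        return super(ExtendedEncoder, self).default(obj)


payload = {'when': datetime.date(2017, 1, 2),
           'amount': decimal.Decimal('9.95'),
           'tags': {'error'}}
encoded = json.dumps(payload, cls=ExtendedEncoder, sort_keys=True)
# '{"amount": "9.95", "tags": ["error"], "when": "2017-01-02"}'
```

The module goes one step further and monkey-patches `dumps`/`dump` with `functools.partial` so callers get the extended behavior without passing `cls=` every time.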
@@ -1,124 +1,119 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | """ |
|
23 | 18 | Helper functions |
|
24 | 19 | """ |
|
25 | 20 | import copy |
|
26 | 21 | import datetime |
|
27 | 22 | |
|
28 | 23 | from collections import namedtuple, OrderedDict |
|
29 | 24 | |
|
30 | 25 | _ = lambda x: x |
|
31 | 26 | |
|
32 | 27 | time_deltas = OrderedDict() |
|
33 | 28 | |
|
34 | 29 | time_deltas['1m'] = {'delta': datetime.timedelta(minutes=1), |
|
35 | 30 | 'label': '1 minute', 'minutes': 1} |
|
36 | 31 | |
|
37 | 32 | time_deltas['5m'] = {'delta': datetime.timedelta(minutes=5), |
|
38 | 33 | 'label': '5 minutes', 'minutes': 5} |
|
39 | 34 | time_deltas['30m'] = {'delta': datetime.timedelta(minutes=30), |
|
40 | 35 | 'label': '30 minutes', 'minutes': 30} |
|
41 | 36 | time_deltas['1h'] = {'delta': datetime.timedelta(hours=1), |
|
42 | 37 | 'label': '60 minutes', 'minutes': 60} |
|
43 | 38 | time_deltas['4h'] = {'delta': datetime.timedelta(hours=4), 'label': '4 hours', |
|
44 | 39 | 'minutes': 60 * 4} |
|
45 | 40 | time_deltas['12h'] = {'delta': datetime.timedelta(hours=12), |
|
46 | 41 | 'label': '12 hours', 'minutes': 60 * 12} |
|
47 | 42 | time_deltas['24h'] = {'delta': datetime.timedelta(hours=24), |
|
48 | 43 | 'label': '24 hours', 'minutes': 60 * 24} |
|
49 | 44 | time_deltas['3d'] = {'delta': datetime.timedelta(days=3), 'label': '3 days', |
|
50 | 45 | 'minutes': 60 * 24 * 3} |
|
51 | 46 | time_deltas['1w'] = {'delta': datetime.timedelta(days=7), 'label': '7 days', |
|
52 | 47 | 'minutes': 60 * 24 * 7} |
|
53 | 48 | time_deltas['2w'] = {'delta': datetime.timedelta(days=14), 'label': '14 days', |
|
54 | 49 | 'minutes': 60 * 24 * 14} |
|
55 | 50 | time_deltas['1M'] = {'delta': datetime.timedelta(days=31), 'label': '31 days', |
|
56 | 51 | 'minutes': 60 * 24 * 31} |
|
57 | 52 | time_deltas['3M'] = {'delta': datetime.timedelta(days=31 * 3), |
|
58 | 53 | 'label': '3 months', |
|
59 | 54 | 'minutes': 60 * 24 * 31 * 3} |
|
60 | 55 | time_deltas['6M'] = {'delta': datetime.timedelta(days=31 * 6), |
|
61 | 56 | 'label': '6 months', |
|
62 | 57 | 'minutes': 60 * 24 * 31 * 6} |
|
63 | 58 | time_deltas['12M'] = {'delta': datetime.timedelta(days=31 * 12), |
|
64 | 59 | 'label': '12 months', |
|
65 | 60 | 'minutes': 60 * 24 * 31 * 12} |
|
66 | 61 | |
|
67 | 62 | # used in json representation |
|
68 | 63 | time_options = dict([(k, {'label': v['label'], 'minutes': v['minutes']}) |
|
69 | 64 | for k, v in time_deltas.items()]) |
|
70 | 65 | FlashMsg = namedtuple('FlashMsg', ['msg', 'level']) |
|
71 | 66 | |
|
72 | 67 | |
|
73 | 68 | def get_flash(request): |
|
74 | 69 | messages = [] |
|
75 | 70 | messages.extend( |
|
76 | 71 | [FlashMsg(msg, 'error') |
|
77 | 72 | for msg in request.session.peek_flash('error')]) |
|
78 | 73 | messages.extend([FlashMsg(msg, 'warning') |
|
79 | 74 | for msg in request.session.peek_flash('warning')]) |
|
80 | 75 | messages.extend( |
|
81 | 76 | [FlashMsg(msg, 'notice') for msg in request.session.peek_flash()]) |
|
82 | 77 | return messages |
|
83 | 78 | |
|
84 | 79 | |
|
85 | 80 | def clear_flash(request): |
|
86 | 81 | request.session.pop_flash('error') |
|
87 | 82 | request.session.pop_flash('warning') |
|
88 | 83 | request.session.pop_flash() |
|
89 | 84 | |
|
90 | 85 | |
|
91 | 86 | def get_type_formatted_flash(request): |
|
92 | 87 | return [{'msg': message.msg, 'type': message.level} |
|
93 | 88 | for message in get_flash(request)] |
|
94 | 89 | |
|
95 | 90 | |
|
96 | 91 | def gen_pagination_headers(request, paginator): |
|
97 | 92 | headers = { |
|
98 | 93 | 'x-total-count': str(paginator.item_count), |
|
99 | 94 | 'x-current-page': str(paginator.page), |
|
100 | 95 | 'x-items-per-page': str(paginator.items_per_page) |
|
101 | 96 | } |
|
102 | 97 | params_dict = request.GET.dict_of_lists() |
|
103 | 98 | last_page_params = copy.deepcopy(params_dict) |
|
104 | 99 | last_page_params['page'] = paginator.last_page or 1 |
|
105 | 100 | first_page_params = copy.deepcopy(params_dict) |
|
106 | 101 | first_page_params.pop('page', None) |
|
107 | 102 | next_page_params = copy.deepcopy(params_dict) |
|
108 | 103 | next_page_params['page'] = paginator.next_page or paginator.last_page or 1 |
|
109 | 104 | prev_page_params = copy.deepcopy(params_dict) |
|
110 | 105 | prev_page_params['page'] = paginator.previous_page or 1 |
|
111 | 106 | lp_url = request.current_route_url(_query=last_page_params) |
|
112 | 107 | fp_url = request.current_route_url(_query=first_page_params) |
|
113 | 108 | links = [ |
|
114 | 109 | 'rel="last", <{}>'.format(lp_url), |
|
115 | 110 | 'rel="first", <{}>'.format(fp_url), |
|
116 | 111 | ] |
|
117 | 112 | if first_page_params != prev_page_params: |
|
118 | 113 | prev_url = request.current_route_url(_query=prev_page_params) |
|
119 | 114 | links.append('rel="prev", <{}>'.format(prev_url)) |
|
120 | 115 | if last_page_params != next_page_params: |
|
121 | 116 | next_url = request.current_route_url(_query=next_page_params) |
|
122 | 117 | links.append('rel="next", <{}>'.format(next_url)) |
|
123 | 118 | headers['link'] = '; '.join(links) |
|
124 | 119 | return headers |
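The pagination helper above can be sketched without a Pyramid request or paginator. In this sketch, `build_pagination_headers` and the example URL are illustrative stand-ins for `request.current_route_url` and the paginate object; the real helper also carries over any existing query parameters:

```python
# Minimal sketch of gen_pagination_headers above, with plain values in
# place of the Pyramid request/paginator (function name and URL are
# illustrative, not part of the original code).
def build_pagination_headers(item_count, page, items_per_page,
                             last_page, next_page, previous_page,
                             base_url='https://example.com/api/logs'):
    headers = {
        'x-total-count': str(item_count),
        'x-current-page': str(page),
        'x-items-per-page': str(items_per_page),
    }

    def url(page_no):
        # The "first" link carries no page parameter, mirroring the
        # first_page_params.pop('page', None) call above.
        if page_no is None:
            return base_url
        return '{}?page={}'.format(base_url, page_no)

    links = [
        'rel="last", <{}>'.format(url(last_page or 1)),
        'rel="first", <{}>'.format(url(None)),
    ]
    if previous_page:
        links.append('rel="prev", <{}>'.format(url(previous_page)))
    if next_page:
        links.append('rel="next", <{}>'.format(url(next_page)))
    headers['link'] = '; '.join(links)
    return headers

headers = build_pagination_headers(95, 2, 10, last_page=10,
                                   next_page=3, previous_page=1)
```

Clients can then page through results by following the `rel="next"` entry in the `link` header instead of constructing URLs themselves.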
@@ -1,51 +1,46 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import re |
|
23 | 18 | from appenlight.lib.ext_json import json |
|
24 | 19 | from jinja2 import Markup, escape, evalcontextfilter |
|
25 | 20 | |
|
26 | 21 | _paragraph_re = re.compile(r'(?:\r\n|\r|\n){2,}') |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | @evalcontextfilter |
|
30 | 25 | def nl2br(eval_ctx, value): |
|
31 | 26 | if eval_ctx.autoescape: |
|
32 | 27 | result = '\n\n'.join('<p>%s</p>' % p.replace('\n', Markup('<br>\n')) |
|
33 | 28 | for p in _paragraph_re.split(escape(value))) |
|
34 | 29 | else: |
|
35 | 30 | result = '\n\n'.join('<p>%s</p>' % p.replace('\n', '<br>\n') |
|
36 | 31 | for p in _paragraph_re.split(escape(value))) |
|
37 | 32 | if eval_ctx.autoescape: |
|
38 | 33 | result = Markup(result) |
|
39 | 34 | return result |
|
40 | 35 | |
|
41 | 36 | |
|
42 | 37 | @evalcontextfilter |
|
43 | 38 | def toJSONUnsafe(eval_ctx, value): |
|
44 | 39 | encoded = json.dumps(value).replace('&', '\\u0026') \ |
|
45 | 40 | .replace('<', '\\u003c') \ |
|
46 | 41 | .replace('>', '\\u003e') \ |
|
47 | 42 | .replace('>', '\\u003e') \ |
|
48 | 43 | .replace('"', '\\u0022') \ |
|
49 | 44 | .replace("'", '\\u0027') \ |
|
50 | 45 | .replace(r'\n', '/\\\n') |
|
51 | 46 | return Markup("'%s'" % encoded) |
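The `nl2br` filter above can be exercised outside Jinja. This sketch drops the autoescape/Markup handling and keeps only the core transformation: runs of blank lines delimit paragraphs, single newlines become `<br>` tags (`nl2br_plain` is an illustrative name, and HTML escaping is omitted):

```python
import re

# Same paragraph-splitting regex as in the filter module above.
_paragraph_re = re.compile(r'(?:\r\n|\r|\n){2,}')

def nl2br_plain(value):
    # Runs of two or more newlines delimit paragraphs; single newlines
    # inside a paragraph become <br> tags (escaping omitted in this sketch).
    return '\n\n'.join('<p>%s</p>' % p.replace('\n', '<br>\n')
                       for p in _paragraph_re.split(value))

html = nl2br_plain('first line\nsecond line\n\nnew paragraph')
```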
@@ -1,69 +1,64 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import json |
|
23 | 18 | import logging |
|
24 | 19 | |
|
25 | 20 | ignored_keys = ['args', 'asctime', 'created', 'exc_info', 'exc_text', |
|
26 | 21 | 'filename', 'funcName', 'levelname', 'levelno', 'lineno', |
|
27 | 22 | 'message', 'module', 'msecs', 'msg', 'name', 'pathname', |
|
28 | 23 | 'process', 'processName', 'relativeCreated', 'stack_info', |
|
29 | 24 | 'thread', 'threadName'] |
|
30 | 25 | |
|
31 | 26 | |
|
32 | 27 | class JSONFormatter(logging.Formatter): |
|
33 | 28 | def format(self, record): |
|
34 | 29 | """ |
|
35 | 30 | Format the specified record as text. |
|
36 | 31 | |
|
37 | 32 | The record's attribute dictionary is used as the operand to a |
|
38 | 33 | string formatting operation which yields the returned string. |
|
39 | 34 | Before formatting the dictionary, a couple of preparatory steps |
|
40 | 35 | are carried out. The message attribute of the record is computed |
|
41 | 36 | using LogRecord.getMessage(). If the formatting string uses the |
|
42 | 37 | time (as determined by a call to usesTime()), formatTime() is |
|
43 | 38 | called to format the event time. If there is exception information, |
|
44 | 39 | it is formatted using formatException() and appended to the message. |
|
45 | 40 | """ |
|
46 | 41 | record.message = record.getMessage() |
|
47 | 42 | log_dict = vars(record) |
|
48 | 43 | keys = [k for k in log_dict.keys() if k not in ignored_keys] |
|
49 | 44 | payload = {'message': record.message} |
|
50 | 45 | payload.update({k: log_dict[k] for k in keys}) |
|
51 | 46 | record.message = json.dumps(payload, default=lambda x: str(x)) |
|
52 | 47 | |
|
53 | 48 | if self.usesTime(): |
|
54 | 49 | record.asctime = self.formatTime(record, self.datefmt) |
|
55 | 50 | s = self.formatMessage(record) |
|
56 | 51 | if record.exc_info: |
|
57 | 52 | # Cache the traceback text to avoid converting it multiple times |
|
58 | 53 | # (it's constant anyway) |
|
59 | 54 | if not record.exc_text: |
|
60 | 55 | record.exc_text = self.formatException(record.exc_info) |
|
61 | 56 | if record.exc_text: |
|
62 | 57 | if s[-1:] != "\n": |
|
63 | 58 | s = s + "\n" |
|
64 | 59 | s = s + record.exc_text |
|
65 | 60 | if record.stack_info: |
|
66 | 61 | if s[-1:] != "\n": |
|
67 | 62 | s = s + "\n" |
|
68 | 63 | s = s + self.formatStack(record.stack_info) |
|
69 | 64 | return s |
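A reduced version of the `JSONFormatter` above shows the key idea: anything attached to the record that is not in `ignored_keys` (typically values passed via `extra`) is folded into one JSON payload together with the rendered message. `MiniJSONFormatter` and the record values are illustrative, not from the original module:

```python
import json
import logging

# Same attribute blacklist as ignored_keys in the module above.
IGNORED = {'args', 'asctime', 'created', 'exc_info', 'exc_text', 'filename',
           'funcName', 'levelname', 'levelno', 'lineno', 'message', 'module',
           'msecs', 'msg', 'name', 'pathname', 'process', 'processName',
           'relativeCreated', 'stack_info', 'thread', 'threadName'}

class MiniJSONFormatter(logging.Formatter):
    # Reduced sketch of JSONFormatter.format: fold non-standard record
    # attributes plus the rendered message into a single JSON document.
    def format(self, record):
        payload = {'message': record.getMessage()}
        payload.update({k: v for k, v in vars(record).items()
                        if k not in IGNORED})
        return json.dumps(payload, default=str)

logger = logging.getLogger('mini-json')
record = logger.makeRecord('mini-json', logging.INFO, 'app.py', 1,
                           'user %s logged in', ('alice',), None,
                           extra={'request_id': 'abc123'})
line = MiniJSONFormatter().format(record)
```

The resulting line parses back into a dict, which is what makes this formatter convenient for log shippers that expect structured records.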
@@ -1,70 +1,65 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | BASE = 'appenlight:data:{}' |
|
23 | 18 | |
|
24 | 19 | REDIS_KEYS = { |
|
25 | 20 | 'tasks': { |
|
26 | 21 | 'add_reports_lock': BASE.format('add_reports_lock:{}'), |
|
27 | 22 | 'add_logs_lock': BASE.format('add_logs_lock:{}'), |
|
28 | 23 | }, |
|
29 | 24 | 'counters': { |
|
30 | 25 | 'events_per_minute_per_user': BASE.format( |
|
31 | 26 | 'events_per_minute_per_user:{}:{}'), |
|
32 | 27 | 'reports_per_minute': BASE.format('reports_per_minute:{}'), |
|
33 | 28 | 'reports_per_hour_per_app': BASE.format( |
|
34 | 29 | 'reports_per_hour_per_app:{}:{}'), |
|
35 | 30 | 'reports_per_type': BASE.format('reports_per_type:{}'), |
|
36 | 31 | 'logs_per_minute': BASE.format('logs_per_minute:{}'), |
|
37 | 32 | 'logs_per_hour_per_app': BASE.format( |
|
38 | 33 | 'logs_per_hour_per_app:{}:{}'), |
|
39 | 34 | 'metrics_per_minute': BASE.format('metrics_per_minute:{}'), |
|
40 | 35 | 'metrics_per_hour_per_app': BASE.format( |
|
41 | 36 | 'metrics_per_hour_per_app:{}:{}'), |
|
42 | 37 | 'report_group_occurences': BASE.format('report_group_occurences:{}'), |
|
43 | 38 | 'report_group_occurences_alerting': BASE.format( |
|
44 | 39 | 'report_group_occurences_alerting:{}'), |
|
45 | 40 | 'report_group_occurences_10th': BASE.format( |
|
46 | 41 | 'report_group_occurences_10th:{}'), |
|
47 | 42 | 'report_group_occurences_100th': BASE.format( |
|
48 | 43 | 'report_group_occurences_100th:{}'), |
|
49 | 44 | }, |
|
50 | 45 | 'rate_limits': { |
|
51 | 46 | 'per_application_reports_rate_limit': BASE.format( |
|
52 | 47 | 'per_application_reports_limit:{}:{}'), |
|
53 | 48 | 'per_application_logs_rate_limit': BASE.format( |
|
54 | 49 | 'per_application_logs_rate_limit:{}:{}'), |
|
55 | 50 | 'per_application_metrics_rate_limit': BASE.format( |
|
56 | 51 | 'per_application_metrics_rate_limit:{}:{}'), |
|
57 | 52 | }, |
|
58 | 53 | 'apps_that_got_new_data_per_hour': BASE.format('apps_that_got_new_data_per_hour:{}'), |
|
59 | 54 | 'apps_that_had_reports': BASE.format('apps_that_had_reports'), |
|
60 | 55 | 'apps_that_had_error_reports': BASE.format('apps_that_had_error_reports'), |
|
61 | 56 | 'apps_that_had_reports_alerting': BASE.format( |
|
62 | 57 | 'apps_that_had_reports_alerting'), |
|
63 | 58 | 'apps_that_had_error_reports_alerting': BASE.format( |
|
64 | 59 | 'apps_that_had_error_reports_alerting'), |
|
65 | 60 | 'reports_to_notify_per_type_per_app': BASE.format( |
|
66 | 61 | 'reports_to_notify_per_type_per_app:{}:{}'), |
|
67 | 62 | 'reports_to_notify_per_type_per_app_alerting': BASE.format( |
|
68 | 63 | 'reports_to_notify_per_type_per_app_alerting:{}:{}'), |
|
69 | 64 | 'seen_tag_list': BASE.format('seen_tag_list') |
|
70 | 65 | } |
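The `BASE.format(...)` calls above rely on the fact that braces inside the *argument* of `str.format` are not interpreted, so each `REDIS_KEYS` entry is a partially filled template whose remaining `{}` slots are filled at use time. The resource id and hour bucket below are example values, not from the source:

```python
BASE = 'appenlight:data:{}'

# Building the template: only the outer {} is substituted here; the
# braces inside the argument pass through untouched.
tmpl = BASE.format('reports_per_hour_per_app:{}:{}')

# Filling the template later with a resource id and an hour bucket.
key = tmpl.format(42, '2017-01-01T10')
```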
@@ -1,140 +1,135 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import json |
|
23 | 18 | |
|
24 | 19 | from pyramid.security import unauthenticated_userid |
|
25 | 20 | |
|
26 | 21 | import appenlight.lib.helpers as helpers |
|
27 | 22 | |
|
28 | 23 | from authomatic.providers import oauth2, oauth1 |
|
29 | 24 | from authomatic import Authomatic |
|
30 | 25 | from appenlight.models.user import User |
|
31 | 26 | |
|
32 | 27 | |
|
33 | 28 | class CSRFException(Exception): |
|
34 | 29 | pass |
|
35 | 30 | |
|
36 | 31 | |
|
37 | 32 | class JSONException(Exception): |
|
38 | 33 | pass |
|
39 | 34 | |
|
40 | 35 | |
|
41 | 36 | def get_csrf_token(request): |
|
42 | 37 | return request.session.get_csrf_token() |
|
43 | 38 | |
|
44 | 39 | |
|
45 | 40 | def safe_json_body(request): |
|
46 | 41 | """ |
|
47 | 42 | Returns None if json body is missing or erroneous |
|
48 | 43 | """ |
|
49 | 44 | try: |
|
50 | 45 | return request.json_body |
|
51 | 46 | except ValueError: |
|
52 | 47 | return None |
|
53 | 48 | |
|
54 | 49 | |
|
55 | 50 | def unsafe_json_body(request): |
|
56 | 51 | """ |
|
57 | 52 | Raises JSONException if the JSON body can't be deserialized |
|
58 | 53 | """ |
|
59 | 54 | try: |
|
60 | 55 | return request.json_body |
|
61 | 56 | except ValueError: |
|
62 | 57 | raise JSONException('Incorrect JSON') |
|
63 | 58 | |
|
64 | 59 | |
|
65 | 60 | def get_user(request): |
|
66 | 61 | if not request.path_info.startswith('/static'): |
|
67 | 62 | user_id = unauthenticated_userid(request) |
|
68 | 63 | try: |
|
69 | 64 | user_id = int(user_id) |
|
70 | 65 | except Exception: |
|
71 | 66 | return None |
|
72 | 67 | |
|
73 | 68 | if user_id: |
|
74 | 69 | user = User.by_id(user_id) |
|
75 | 70 | if user: |
|
76 | 71 | request.environ['appenlight.username'] = '%d:%s' % ( |
|
77 | 72 | user_id, user.user_name) |
|
78 | 73 | return user |
|
79 | 74 | else: |
|
80 | 75 | return None |
|
81 | 76 | |
|
82 | 77 | |
|
83 | 78 | def es_conn(request): |
|
84 | 79 | return request.registry.es_conn |
|
85 | 80 | |
|
86 | 81 | |
|
87 | 82 | def add_flash_to_headers(request, clear=True): |
|
88 | 83 | """ |
|
89 | 84 | Adds pending flash messages to the response; if clear is true, clears |

90 | 85 | out the flash queue |
|
91 | 86 | """ |
|
92 | 87 | flash_msgs = helpers.get_type_formatted_flash(request) |
|
93 | 88 | request.response.headers['x-flash-messages'] = json.dumps(flash_msgs) |
|
94 | 89 | helpers.clear_flash(request) |
|
95 | 90 | |
|
96 | 91 | |
|
97 | 92 | def get_authomatic(request): |
|
98 | 93 | settings = request.registry.settings |
|
99 | 94 | # authomatic social auth |
|
100 | 95 | authomatic_conf = { |
|
101 | 96 | # callback http://yourapp.com/social_auth/twitter |
|
102 | 97 | 'twitter': { |
|
103 | 98 | 'class_': oauth1.Twitter, |
|
104 | 99 | 'consumer_key': settings.get('authomatic.pr.twitter.key', ''), |
|
105 | 100 | 'consumer_secret': settings.get('authomatic.pr.twitter.secret', |
|
106 | 101 | ''), |
|
107 | 102 | }, |
|
108 | 103 | # callback http://yourapp.com/social_auth/facebook |
|
109 | 104 | 'facebook': { |
|
110 | 105 | 'class_': oauth2.Facebook, |
|
111 | 106 | 'consumer_key': settings.get('authomatic.pr.facebook.app_id', ''), |
|
112 | 107 | 'consumer_secret': settings.get('authomatic.pr.facebook.secret', |
|
113 | 108 | ''), |
|
114 | 109 | 'scope': ['email'], |
|
115 | 110 | }, |
|
116 | 111 | # callback http://yourapp.com/social_auth/google |
|
117 | 112 | 'google': { |
|
118 | 113 | 'class_': oauth2.Google, |
|
119 | 114 | 'consumer_key': settings.get('authomatic.pr.google.key', ''), |
|
120 | 115 | 'consumer_secret': settings.get( |
|
121 | 116 | 'authomatic.pr.google.secret', ''), |
|
122 | 117 | 'scope': ['profile', 'email'], |
|
123 | 118 | }, |
|
124 | 119 | 'github': { |
|
125 | 120 | 'class_': oauth2.GitHub, |
|
126 | 121 | 'consumer_key': settings.get('authomatic.pr.github.key', ''), |
|
127 | 122 | 'consumer_secret': settings.get( |
|
128 | 123 | 'authomatic.pr.github.secret', ''), |
|
129 | 124 | 'scope': ['repo', 'public_repo', 'user:email'], |
|
130 | 125 | 'access_headers': {'User-Agent': 'AppEnlight'}, |
|
131 | 126 | }, |
|
132 | 127 | 'bitbucket': { |
|
133 | 128 | 'class_': oauth1.Bitbucket, |
|
134 | 129 | 'consumer_key': settings.get('authomatic.pr.bitbucket.key', ''), |
|
135 | 130 | 'consumer_secret': settings.get( |
|
136 | 131 | 'authomatic.pr.bitbucket.secret', '') |
|
137 | 132 | } |
|
138 | 133 | } |
|
139 | 134 | return Authomatic( |
|
140 | 135 | config=authomatic_conf, secret=settings['authomatic.secret']) |
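The contrast between `safe_json_body` and `unsafe_json_body` above is easy to demonstrate with a stand-in request object. `FakeRequest` is illustrative; a real Pyramid request raises `ValueError` from `json_body` the same way when the body is not valid JSON:

```python
import json

class FakeRequest:
    # Stand-in for a Pyramid request: json_body raises ValueError
    # (via json.JSONDecodeError) on a malformed body.
    def __init__(self, body):
        self._body = body

    @property
    def json_body(self):
        return json.loads(self._body)

class JSONException(Exception):
    pass

def safe_json_body(request):
    # Swallows parse errors and returns None, as in the module above.
    try:
        return request.json_body
    except ValueError:
        return None

def unsafe_json_body(request):
    # Converts parse errors into an application-level exception.
    try:
        return request.json_body
    except ValueError:
        raise JSONException('Incorrect JSON')
```

The "safe" variant suits endpoints where a missing body is a normal case; the "unsafe" one suits endpoints where bad JSON should surface as an error response.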
@@ -1,303 +1,298 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | import operator |
|
24 | 19 | |
|
25 | 20 | log = logging.getLogger(__name__) |
|
26 | 21 | |
|
27 | 22 | |
|
28 | 23 | class RuleException(Exception): |
|
29 | 24 | pass |
|
30 | 25 | |
|
31 | 26 | |
|
32 | 27 | class KeyNotFoundException(RuleException): |
|
33 | 28 | pass |
|
34 | 29 | |
|
35 | 30 | |
|
36 | 31 | class UnknownTypeException(RuleException): |
|
37 | 32 | pass |
|
38 | 33 | |
|
39 | 34 | |
|
40 | 35 | class BadConfigException(RuleException): |
|
41 | 36 | pass |
|
42 | 37 | |
|
43 | 38 | |
|
44 | 39 | class InvalidValueException(RuleException): |
|
45 | 40 | pass |
|
46 | 41 | |
|
47 | 42 | |
|
48 | 43 | class RuleBase(object): |
|
49 | 44 | @classmethod |
|
50 | 45 | def default_dict_struct_getter(cls, struct, field_name): |
|
51 | 46 | """ |
|
52 | 47 | returns a key from dictionary based on field_name, if the name contains |
|
53 | 48 | `:` then it means additional nesting levels should be checked for the |
|
54 | 49 | key so `a:b:c` means return struct['a']['b']['c'] |
|
55 | 50 | |
|
56 | 51 | :param struct: |
|
57 | 52 | :param field_name: |
|
58 | 53 | :return: |
|
59 | 54 | """ |
|
60 | 55 | parts = field_name.split(':') if field_name else [] |
|
61 | 56 | found = struct |
|
62 | 57 | while parts: |
|
63 | 58 | current_key = parts.pop(0) |
|
64 | 59 | found = found.get(current_key) |
|
65 | 60 | if not found and parts: |
|
66 | 61 | raise KeyNotFoundException('Key not found in structure') |
|
67 | 62 | return found |
|
68 | 63 | |
|
69 | 64 | @classmethod |
|
70 | 65 | def default_obj_struct_getter(cls, struct, field_name): |
|
71 | 66 | """ |
|
72 | 67 | returns a key from instance based on field_name, if the name contains |
|
73 | 68 | `:` then it means additional nesting levels should be checked for the |
|
74 | 69 | key so `a:b:c` means return struct.a.b.c |
|
75 | 70 | |
|
76 | 71 | :param struct: |
|
77 | 72 | :param field_name: |
|
78 | 73 | :return: |
|
79 | 74 | """ |
|
80 | 75 | parts = field_name.split(':') |
|
81 | 76 | found = struct |
|
82 | 77 | while parts: |
|
83 | 78 | current_key = parts.pop(0) |
|
84 | 79 | found = getattr(found, current_key, None) |
|
85 | 80 | if not found and parts: |
|
86 | 81 | raise KeyNotFoundException('Key not found in structure') |
|
87 | 82 | return found |
|
88 | 83 | |
|
89 | 84 | def normalized_type(self, field, value): |
|
90 | 85 | """ |
|
91 | 86 | Converts text values from self.conf_value based on type_matrix below |
|
92 | 87 | check_matrix defines what kind of checks we can perform on a field |
|
93 | 88 | value based on field name |
|
94 | 89 | """ |
|
95 | 90 | f_type = self.type_matrix.get(field) |
|
96 | 91 | if f_type: |
|
97 | 92 | cast_to = f_type['type'] |
|
98 | 93 | else: |
|
99 | 94 | raise UnknownTypeException('Unknown type') |
|
100 | 95 | |
|
101 | 96 | if value is None: |
|
102 | 97 | return None |
|
103 | 98 | |
|
104 | 99 | try: |
|
105 | 100 | if cast_to == 'int': |
|
106 | 101 | return int(value) |
|
107 | 102 | elif cast_to == 'float': |
|
108 | 103 | return float(value) |
|
109 | 104 | elif cast_to == 'unicode': |
|
110 | 105 | return str(value) |
|
111 | 106 | except ValueError as exc: |
|
112 | 107 | raise InvalidValueException(exc) |
|
113 | 108 | |
|
114 | 109 | |
|
115 | 110 | class Rule(RuleBase): |
|
116 | 111 | def __init__(self, config, type_matrix, |
|
117 | 112 | struct_getter=RuleBase.default_dict_struct_getter, |
|
118 | 113 | config_manipulator=None): |
|
119 | 114 | """ |
|
120 | 115 | |
|
121 | 116 | :param config: dict - contains rule configuration |
|
122 | 117 | example:: |
|
123 | 118 | { |
|
124 | 119 | "field": "__OR__", |
|
125 | 120 | "rules": [ |
|
126 | 121 | { |
|
127 | 122 | "field": "__AND__", |
|
128 | 123 | "rules": [ |
|
129 | 124 | { |
|
130 | 125 | "op": "ge", |
|
131 | 126 | "field": "occurences", |
|
132 | 127 | "value": "10" |
|
133 | 128 | }, |
|
134 | 129 | { |
|
135 | 130 | "op": "ge", |
|
136 | 131 | "field": "priority", |
|
137 | 132 | "value": "4" |
|
138 | 133 | } |
|
139 | 134 | ] |
|
140 | 135 | }, |
|
141 | 136 | { |
|
142 | 137 | "op": "eq", |
|
143 | 138 | "field": "http_status", |
|
144 | 139 | "value": "500" |
|
145 | 140 | } |
|
146 | 141 | ] |
|
147 | 142 | } |
|
148 | 143 | :param type_matrix: dict - contains map of type casts |
|
149 | 144 | example:: |
|
150 | 145 | { |
|
151 | 146 | 'http_status': 'int', |
|
152 | 147 | 'priority': 'unicode', |
|
153 | 148 | } |
|
154 | 149 | :param struct_getter: callable - used to grab the value of field from |
|
155 | 150 | the structure passed to match() based |
|
156 | 151 | on key, default |
|
157 | 152 | |
|
158 | 153 | """ |
|
159 | 154 | self.type_matrix = type_matrix |
|
160 | 155 | self.config = config |
|
161 | 156 | self.struct_getter = struct_getter |
|
162 | 157 | self.config_manipulator = config_manipulator |
|
163 | 158 | if config_manipulator: |
|
164 | 159 | config_manipulator(self) |
|
165 | 160 | |
|
166 | 161 | def subrule_check(self, rule_config, struct): |
|
167 | 162 | rule = Rule(rule_config, self.type_matrix, |
|
168 | 163 | config_manipulator=self.config_manipulator) |
|
169 | 164 | return rule.match(struct) |
|
170 | 165 | |
|
171 | 166 | def match(self, struct): |
|
172 | 167 | """ |
|
173 | 168 | Check if rule matched for this specific report |
|
174 | 169 | First tries report value, then tests tags in not found, then finally |
|
175 | 170 | report group |
|
176 | 171 | """ |
|
177 | 172 | field_name = self.config.get('field') |
|
178 | 173 | test_value = self.config.get('value') |
|
179 | 174 | |
|
180 | 175 | if not field_name: |
|
181 | 176 | return False |
|
182 | 177 | |
|
183 | 178 | if field_name == '__AND__': |
|
184 | 179 | rule = AND(self.config['rules'], self.type_matrix, |
|
185 | 180 | config_manipulator=self.config_manipulator) |
|
186 | 181 | return rule.match(struct) |
|
187 | 182 | elif field_name == '__OR__': |
|
188 | 183 | rule = OR(self.config['rules'], self.type_matrix, |
|
189 | 184 | config_manipulator=self.config_manipulator) |
|
190 | 185 | return rule.match(struct) |
|
191 | 186 | elif field_name == '__NOT__': |
|
192 | 187 | rule = NOT(self.config['rules'], self.type_matrix, |
|
193 | 188 | config_manipulator=self.config_manipulator) |
|
194 | 189 | return rule.match(struct) |
|
195 | 190 | |
|
196 | 191 | if test_value is None: |
|
197 | 192 | return False |
|
198 | 193 | |
|
199 | 194 | try: |
|
200 | 195 | struct_value = self.normalized_type(field_name, |
|
201 | 196 | self.struct_getter(struct, |
|
202 | 197 | field_name)) |
|
203 | 198 | except (UnknownTypeException, InvalidValueException) as exc: |
|
204 | 199 | log.error(str(exc)) |
|
205 | 200 | return False |
|
206 | 201 | |
|
207 | 202 | try: |
|
208 | 203 | test_value = self.normalized_type(field_name, test_value) |
|
209 | 204 | except (UnknownTypeException, InvalidValueException) as exc: |
|
210 | 205 | log.error(str(exc)) |
|
211 | 206 | return False |
|
212 | 207 | |
|
213 | 208 | if self.config['op'] not in ('startswith', 'endswith', 'contains'): |
|
214 | 209 | try: |
|
215 | 210 | return getattr(operator, |
|
216 | 211 | self.config['op'])(struct_value, test_value) |
|
217 | 212 | except TypeError: |
|
218 | 213 | return False |
|
219 | 214 | elif self.config['op'] == 'startswith': |
|
220 | 215 | return struct_value.startswith(test_value) |
|
221 | 216 | elif self.config['op'] == 'endswith': |
|
222 | 217 | return struct_value.endswith(test_value) |
|
223 | 218 | elif self.config['op'] == 'contains': |
|
224 | 219 | return test_value in struct_value |
|
225 | 220 | raise BadConfigException('Invalid configuration, ' |
|
226 | 221 | 'unknown operator: {}'.format(self.config)) |
|
227 | 222 | |
|
228 | 223 | def __repr__(self): |
|
229 | 224 | return '<Rule {} {}>'.format(self.config.get('field'), |
|
230 | 225 | self.config.get('value')) |
|
231 | 226 | |
|
232 | 227 | |
|
233 | 228 | class AND(Rule): |
|
234 | 229 | def __init__(self, rules, *args, **kwargs): |
|
235 | 230 | super(AND, self).__init__({}, *args, **kwargs) |
|
236 | 231 | self.rules = rules |
|
237 | 232 | |
|
238 | 233 | def match(self, struct): |
|
239 | 234 | return all([self.subrule_check(r_conf, struct) for r_conf |
|
240 | 235 | in self.rules]) |
|
241 | 236 | |
|
242 | 237 | |
|
243 | 238 | class NOT(Rule): |
|
244 | 239 | def __init__(self, rules, *args, **kwargs): |
|
245 | 240 | super(NOT, self).__init__({}, *args, **kwargs) |
|
246 | 241 | self.rules = rules |
|
247 | 242 | |
|
248 | 243 | def match(self, struct): |
|
249 | 244 | return all([not self.subrule_check(r_conf, struct) for r_conf |
|
250 | 245 | in self.rules]) |
|
251 | 246 | |
|
252 | 247 | |
|
253 | 248 | class OR(Rule): |
|
254 | 249 | def __init__(self, rules, *args, **kwargs): |
|
255 | 250 | super(OR, self).__init__({}, *args, **kwargs) |
|
256 | 251 | self.rules = rules |
|
257 | 252 | |
|
258 | 253 | def match(self, struct): |
|
259 | 254 | return any([self.subrule_check(r_conf, struct) for r_conf |
|
260 | 255 | in self.rules]) |
|
261 | 256 | |
|
262 | 257 | |
|
263 | 258 | class RuleService(object): |
|
264 | 259 | @staticmethod |
|
265 | 260 | def rule_from_config(config, field_mappings, labels_dict, |
|
266 | 261 | manipulator_func=None): |
|
267 | 262 | """ |
|
268 | 263 | Returns modified rule with manipulator function |
|
269 | 264 | By default manipulator function replaces field id from labels_dict |
|
270 | 265 | with current field id proper for the rule from fields_mappings |
|
271 | 266 | |
|
272 | 267 | because label X_X id might be pointing different value on next request |
|
273 | 268 | when new term is returned from elasticsearch - this ensures things |
|
274 | 269 | are kept 1:1 all the time |
|
275 | 270 | """ |
|
276 | 271 | rev_map = {} |
|
277 | 272 | for k, v in labels_dict.items(): |
|
278 | 273 | rev_map[(v['agg'], v['key'],)] = k |
|
279 | 274 | |
|
280 | 275 | if manipulator_func is None: |
|
281 | 276 | def label_rewriter_func(rule): |
|
282 | 277 | field = rule.config.get('field') |
|
283 | 278 | if not field or rule.config['field'] in ['__OR__', |
|
284 | 279 | '__AND__', '__NOT__']: |
|
285 | 280 | return |
|
286 | 281 | |
|
287 | 282 | to_map = field_mappings.get(rule.config['field']) |
|
288 | 283 | |
|
289 | 284 | # we need to replace series field with _AE_NOT_FOUND_ to not match |
|
290 | 285 | # accidently some other field which happens to have the series that |
|
291 | 286 | # was used when the alert was created |
|
292 | 287 | if to_map: |
|
293 | 288 | to_replace = rev_map.get((to_map['agg'], to_map['key'],), |
|
294 | 289 | '_AE_NOT_FOUND_') |
|
295 | 290 | else: |
|
296 | 291 | to_replace = '_AE_NOT_FOUND_' |
|
297 | 292 | |
|
298 | 293 | rule.config['field'] = to_replace |
|
299 | 294 | rule.type_matrix[to_replace] = {"type": 'float'} |
|
300 | 295 | |
|
301 | 296 | manipulator_func = label_rewriter_func |
|
302 | 297 | |
|
303 | 298 | return Rule(config, {}, config_manipulator=manipulator_func) |
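The rule engine above can be sketched in a few lines for flat dict structures. This illustrative `match` function skips the type casting, `a:b:c` nested getters, and config manipulators of the real `Rule` class, but evaluates the same `__AND__`/`__OR__`/`__NOT__` config shape (values are already ints here, and the field spelling `occurences` follows the source):

```python
import operator

def match(config, struct):
    # Composite nodes recurse over their sub-rules, as in AND/OR/NOT above.
    field = config.get('field')
    if field == '__AND__':
        return all(match(r, struct) for r in config['rules'])
    if field == '__OR__':
        return any(match(r, struct) for r in config['rules'])
    if field == '__NOT__':
        return all(not match(r, struct) for r in config['rules'])
    # Leaf nodes dispatch on the operator name, mirroring Rule.match:
    # string ops are handled specially, everything else goes through
    # the operator module (eq, ge, lt, ...).
    op, value = config['op'], struct.get(field)
    if op in ('startswith', 'endswith'):
        return getattr(value, op)(config['value'])
    if op == 'contains':
        return config['value'] in value
    return getattr(operator, op)(value, config['value'])

# The example config from the Rule docstring, with numeric values.
rule = {
    'field': '__OR__',
    'rules': [
        {'field': '__AND__',
         'rules': [{'op': 'ge', 'field': 'occurences', 'value': 10},
                   {'op': 'ge', 'field': 'priority', 'value': 4}]},
        {'op': 'eq', 'field': 'http_status', 'value': 500},
    ],
}
```

A report matches when either both AND conditions hold or the status code is exactly 500, which is the behavior the nested config encodes.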
@@ -1,65 +1,60 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from ziggurat_foundations.models.services.external_identity import \ |
|
23 | 18 | ExternalIdentityService |
|
24 | 19 | from appenlight.models.external_identity import ExternalIdentity |
|
25 | 20 | |
|
26 | 21 | |
|
27 | 22 | def handle_social_data(request, user, social_data): |
|
28 | 23 | social_data = social_data |
|
29 | 24 | update_identity = False |
|
30 | 25 | |
|
31 | 26 | extng_id = ExternalIdentityService.by_external_id_and_provider( |
|
32 | 27 | social_data['user']['id'], |
|
33 | 28 | social_data['credentials'].provider_name |
|
34 | 29 | ) |
|
35 | 30 | |
|
36 | 31 | # fix legacy accounts with wrong google ID |
|
37 | 32 | if not extng_id and social_data['credentials'].provider_name == 'google': |
|
38 | 33 | extng_id = ExternalIdentityService.by_external_id_and_provider( |
|
39 | 34 | social_data['user']['email'], |
|
40 | 35 | social_data['credentials'].provider_name |
|
41 | 36 | ) |
|
42 | 37 | |
|
43 | 38 | if extng_id: |
|
44 | 39 | extng_id.delete() |
|
45 | 40 | update_identity = True |
|
46 | 41 | |
|
47 | 42 | if not social_data['user']['id']: |
|
48 | 43 | request.session.flash( |
|
49 | 44 | 'No external user id found? Perhaps permissions for ' |
|
50 | 45 | 'authentication are set incorrectly', 'error') |
|
51 | 46 | return False |
|
52 | 47 | |
|
53 | 48 | if not extng_id or update_identity: |
|
54 | 49 | if not update_identity: |
|
55 | 50 | request.session.flash('Your external identity is now ' |
|
56 | 51 | 'connected with your account') |
|
57 | 52 | ex_identity = ExternalIdentity() |
|
58 | 53 | ex_identity.external_id = social_data['user']['id'] |
|
59 | 54 | ex_identity.external_user_name = social_data['user']['user_name'] |
|
60 | 55 | ex_identity.provider_name = social_data['credentials'].provider_name |
|
61 | 56 | ex_identity.access_token = social_data['credentials'].token |
|
62 | 57 | ex_identity.token_secret = social_data['credentials'].token_secret |
|
63 | 58 | ex_identity.alt_token = social_data['credentials'].refresh_token |
|
64 | 59 | user.external_identities.append(ex_identity) |
|
65 | 60 | request.session.pop('zigg.social_auth', None) |
@@ -1,54 +1,49 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import binascii |
|
23 | 18 | import sqlalchemy.types as types |
|
24 | 19 | |
|
25 | 20 | import appenlight.lib.encryption as encryption |
|
26 | 21 | |
|
27 | 22 | |
|
28 | 23 | class BinaryHex(types.TypeDecorator): |
|
29 | 24 | impl = types.LargeBinary |
|
30 | 25 | |
|
31 | 26 | def process_bind_param(self, value, dialect): |
|
32 | 27 | if value is not None: |
|
33 | 28 | value = binascii.unhexlify(value) |
|
34 | 29 | |
|
35 | 30 | return value |
|
36 | 31 | |
|
37 | 32 | def process_result_value(self, value, dialect): |
|
38 | 33 | if value is not None: |
|
39 | 34 | value = binascii.hexlify(value) |
|
40 | 35 | return value |
|
41 | 36 | |
|
42 | 37 | |
|
43 | 38 | class EncryptedUnicode(types.TypeDecorator): |
|
44 | 39 | impl = types.Unicode |
|
45 | 40 | |
|
46 | 41 | def process_bind_param(self, value, dialect): |
|
47 | 42 | if not value: |
|
48 | 43 | return value |
|
49 | 44 | return encryption.encrypt_fernet(value) |
|
50 | 45 | |
|
51 | 46 | def process_result_value(self, value, dialect): |
|
52 | 47 | if not value: |
|
53 | 48 | return value |
|
54 | 49 | return encryption.decrypt_fernet(value) |
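The `BinaryHex` bind/result conversion above is a plain `binascii` round trip, which can be checked without SQLAlchemy:

```python
import binascii

# what process_bind_param does: hex string -> raw bytes for storage
stored = binascii.unhexlify('deadbeef')

# what process_result_value does: raw bytes -> hex string on the way out
roundtrip = binascii.hexlify(stored)

print(stored)     # b'\xde\xad\xbe\xef'
print(roundtrip)  # b'deadbeef'
```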
@@ -1,495 +1,490 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | """ |
|
23 | 18 | Utility functions. |
|
24 | 19 | """ |
|
25 | 20 | import logging |
|
26 | 21 | import requests |
|
27 | 22 | import hashlib |
|
28 | 23 | import json |
|
29 | 24 | import copy |
|
30 | 25 | import uuid |
|
31 | 26 | import appenlight.lib.helpers as h |
|
32 | 27 | from collections import namedtuple |
|
33 | 28 | from datetime import timedelta, datetime, date |
|
34 | 29 | from dogpile.cache.api import NO_VALUE |
|
35 | 30 | from appenlight.models import Datastores |
|
36 | 31 | from appenlight.validators import (LogSearchSchema, |
|
37 | 32 | TagListSchema, |
|
38 | 33 | accepted_search_params) |
|
39 | 34 | from itsdangerous import TimestampSigner |
|
40 | 35 | from ziggurat_foundations.permissions import ALL_PERMISSIONS |
|
41 | 36 | from dateutil.relativedelta import relativedelta |
|
42 | 37 | from dateutil.rrule import rrule, MONTHLY, DAILY |
|
43 | 38 | |
|
44 | 39 | log = logging.getLogger(__name__) |
|
45 | 40 | |
|
46 | 41 | |
|
47 | 42 | Stat = namedtuple('Stat', 'start_interval value') |
|
48 | 43 | |
|
49 | 44 | |
|
50 | 45 | def default_extractor(item): |
|
51 | 46 | """ |
|
52 | 47 | :param item - item to extract date from |
|
53 | 48 | """ |
|
54 | 49 | if hasattr(item, 'start_interval'): |
|
55 | 50 | return item.start_interval |
|
56 | 51 | return item['start_interval'] |
|
57 | 52 | |
|
58 | 53 | |
|
59 | 54 | # fast gap generator |
|
60 | 55 | def gap_gen_default(start, step, itemiterator, end_time=None, |
|
61 | 56 | iv_extractor=None): |
|
62 | 57 | """ generates a list of time/value items based on step and itemiterator; |
|
63 | 58 | if there are entries missing from the iterator, time/None will be returned |
|
64 | 59 | instead |
|
65 | 60 | :param start - datetime - what time should we start generating our values |
|
66 | 61 | :param step - timedelta - stepsize |
|
67 | 62 | :param itemiterator - iterable - we will check this iterable for values |
|
68 | 63 | corresponding to generated steps |
|
69 | 64 | :param end_time - datetime - when last step is >= end_time stop iterating |
|
70 | 65 | :param iv_extractor - extracts current step from iterable items |
|
71 | 66 | """ |
|
72 | 67 | |
|
73 | 68 | if not iv_extractor: |
|
74 | 69 | iv_extractor = default_extractor |
|
75 | 70 | |
|
76 | 71 | next_step = start |
|
77 | 72 | minutes = step.total_seconds() / 60.0 |
|
78 | 73 | while next_step.minute % minutes != 0: |
|
79 | 74 | next_step = next_step.replace(minute=next_step.minute - 1) |
|
80 | 75 | for item in itemiterator: |
|
81 | 76 | item_start_interval = iv_extractor(item) |
|
82 | 77 | # do we have a match for current time step in our data? |
|
83 | 78 | # if not, generate a new tuple with a None value |
|
84 | 79 | while next_step < item_start_interval: |
|
85 | 80 | yield Stat(next_step, None) |
|
86 | 81 | next_step = next_step + step |
|
87 | 82 | if next_step == item_start_interval: |
|
88 | 83 | yield Stat(item_start_interval, item) |
|
89 | 84 | next_step = next_step + step |
|
90 | 85 | if end_time: |
|
91 | 86 | while next_step < end_time: |
|
92 | 87 | yield Stat(next_step, None) |
|
93 | 88 | next_step = next_step + step |
|
94 | 89 | |
|
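The gap-filling behavior of `gap_gen_default` can be sketched with a trimmed re-implementation, using datetimes directly as the iterator items for brevity (the real function extracts `start_interval` from richer objects):

```python
from collections import namedtuple
from datetime import datetime, timedelta

Stat = namedtuple('Stat', 'start_interval value')

def gap_fill(start, step, items, end_time):
    # yields Stat(step, item) for present steps and Stat(step, None) for gaps
    next_step = start
    for item in items:
        while next_step < item:
            yield Stat(next_step, None)
            next_step += step
        if next_step == item:
            yield Stat(item, item)
            next_step += step
    while next_step < end_time:
        yield Stat(next_step, None)
        next_step += step

start = datetime(2017, 1, 1, 0, 0)
step = timedelta(minutes=5)
items = [start, start + 2 * step]  # 00:00 and 00:10 present, 00:05 missing
out = list(gap_fill(start, step, items, start + 3 * step))
print([s.value is None for s in out])  # [False, True, False]
```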
95 | 90 | |
|
96 | 91 | class DateTimeEncoder(json.JSONEncoder): |
|
97 | 92 | """ Simple datetime to ISO encoder for json serialization""" |
|
98 | 93 | |
|
99 | 94 | def default(self, obj): |
|
100 | 95 | if isinstance(obj, date): |
|
101 | 96 | return obj.isoformat() |
|
102 | 97 | if isinstance(obj, datetime): |
|
103 | 98 | return obj.isoformat() |
|
104 | 99 | return json.JSONEncoder.default(self, obj) |
|
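Used with `json.dumps`, the `DateTimeEncoder` above turns dates and datetimes into ISO strings (the encoder is reproduced here so the snippet is self-contained):

```python
import json
from datetime import date, datetime

class DateTimeEncoder(json.JSONEncoder):
    """Same idea as above: serialize dates/datetimes as ISO strings."""
    def default(self, obj):
        if isinstance(obj, (date, datetime)):
            return obj.isoformat()
        return json.JSONEncoder.default(self, obj)

payload = {'when': datetime(2017, 5, 1, 12, 30), 'day': date(2017, 5, 1)}
print(json.dumps(payload, cls=DateTimeEncoder, sort_keys=True))
# {"day": "2017-05-01", "when": "2017-05-01T12:30:00"}
```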
105 | 100 | |
|
106 | 101 | |
|
107 | 102 | def channelstream_request(secret, endpoint, payload, throw_exceptions=False, |
|
108 | 103 | servers=None): |
|
109 | 104 | responses = [] |
|
110 | 105 | if not servers: |
|
111 | 106 | servers = [] |
|
112 | 107 | |
|
113 | 108 | signer = TimestampSigner(secret) |
|
114 | 109 | sig_for_server = signer.sign(endpoint) |
|
115 | 110 | for secret, server in [(s['secret'], s['server']) for s in servers]: |
|
116 | 111 | response = {} |
|
117 | 112 | secret_headers = {'x-channelstream-secret': sig_for_server, |
|
118 | 113 | 'x-channelstream-endpoint': endpoint, |
|
119 | 114 | 'Content-Type': 'application/json'} |
|
120 | 115 | url = '%s%s' % (server, endpoint) |
|
121 | 116 | try: |
|
122 | 117 | response = requests.post(url, |
|
123 | 118 | data=json.dumps(payload, |
|
124 | 119 | cls=DateTimeEncoder), |
|
125 | 120 | headers=secret_headers, |
|
126 | 121 | verify=False, |
|
127 | 122 | timeout=2).json() |
|
128 | 123 | except requests.exceptions.RequestException as e: |
|
129 | 124 | if throw_exceptions: |
|
130 | 125 | raise |
|
131 | 126 | responses.append(response) |
|
132 | 127 | return responses |
|
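The per-server signing in `channelstream_request` boils down to an HMAC over the endpoint path with a shared secret. The real code uses `itsdangerous.TimestampSigner`, which additionally frames in a timestamp; a stdlib-only sketch of the same idea:

```python
import hashlib
import hmac

def sign_endpoint(secret, endpoint):
    # HMAC the endpoint path with the shared secret
    # (itsdangerous adds timestamp/separator framing on top of this)
    return hmac.new(secret.encode(), endpoint.encode(),
                    hashlib.sha1).hexdigest()

sig = sign_endpoint('secret', '/message')
headers = {'x-channelstream-secret': sig,
           'x-channelstream-endpoint': '/message',
           'Content-Type': 'application/json'}
print(len(sig))  # 40 hex chars for sha1
```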
133 | 128 | |
|
134 | 129 | |
|
135 | 130 | def add_cors_headers(response): |
|
136 | 131 | # allow CORS |
|
137 | 132 | response.headers.add('Access-Control-Allow-Origin', '*') |
|
138 | 133 | response.headers.add('XDomainRequestAllowed', '1') |
|
139 | 134 | response.headers.add('Access-Control-Allow-Methods', 'GET, POST, OPTIONS') |
|
140 | 135 | # response.headers.add('Access-Control-Allow-Credentials', 'true') |
|
141 | 136 | response.headers.add('Access-Control-Allow-Headers', |
|
142 | 137 | 'Content-Type, Depth, User-Agent, X-File-Size, X-Requested-With, If-Modified-Since, X-File-Name, Cache-Control, Pragma, Origin, Connection, Referer, Cookie') |
|
143 | 138 | response.headers.add('Access-Control-Max-Age', '86400') |
|
144 | 139 | |
|
145 | 140 | |
|
146 | 141 | from sqlalchemy.sql import compiler |
|
147 | 142 | from psycopg2.extensions import adapt as sqlescape |
|
148 | 143 | |
|
149 | 144 | |
|
150 | 145 | # or use the appropriate escape function from your db driver |
|
151 | 146 | |
|
152 | 147 | def compile_query(query): |
|
153 | 148 | dialect = query.session.bind.dialect |
|
154 | 149 | statement = query.statement |
|
155 | 150 | comp = compiler.SQLCompiler(dialect, statement) |
|
156 | 151 | comp.compile() |
|
157 | 152 | enc = dialect.encoding |
|
158 | 153 | params = {} |
|
159 | 154 | for k, v in comp.params.items(): |
|
160 | 155 | if isinstance(v, str): |
|
161 | 156 | v = v.encode(enc) |
|
162 | 157 | params[k] = sqlescape(v) |
|
163 | 158 | return (comp.string.encode(enc) % params).decode(enc) |
|
164 | 159 | |
|
165 | 160 | |
|
166 | 161 | def convert_es_type(input_data): |
|
167 | 162 | """ |
|
168 | 163 | This might need to convert some text or other types to corresponding ES types |
|
169 | 164 | """ |
|
170 | 165 | return str(input_data) |
|
171 | 166 | |
|
172 | 167 | |
|
173 | 168 | ProtoVersion = namedtuple('ProtoVersion', ['major', 'minor', 'patch']) |
|
174 | 169 | |
|
175 | 170 | |
|
176 | 171 | def parse_proto(input_data): |
|
177 | 172 | try: |
|
178 | 173 | parts = [int(x) for x in input_data.split('.')] |
|
179 | 174 | while len(parts) < 3: |
|
180 | 175 | parts.append(0) |
|
181 | 176 | return ProtoVersion(*parts) |
|
182 | 177 | except Exception as e: |
|
183 | 178 | log.info('Unknown protocol version: %s' % e) |
|
184 | 179 | return ProtoVersion(99, 99, 99) |
|
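Reproduced in reduced form, `parse_proto` pads short version strings with zeros and falls back to a `99.99.99` sentinel for anything unparseable:

```python
from collections import namedtuple

ProtoVersion = namedtuple('ProtoVersion', ['major', 'minor', 'patch'])

def parse_proto(input_data):
    # pad short version strings with zeros; unknown -> 99.99.99 sentinel
    try:
        parts = [int(x) for x in input_data.split('.')]
        while len(parts) < 3:
            parts.append(0)
        return ProtoVersion(*parts)
    except Exception:
        return ProtoVersion(99, 99, 99)

print(parse_proto('0.5'))      # ProtoVersion(major=0, minor=5, patch=0)
print(parse_proto('garbage'))  # ProtoVersion(major=99, minor=99, patch=99)
```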
185 | 180 | |
|
186 | 181 | |
|
187 | 182 | def es_index_name_limiter(start_date=None, end_date=None, months_in_past=6, |
|
188 | 183 | ixtypes=None): |
|
189 | 184 | """ |
|
190 | 185 | This function limits the search to 6 months by default so we don't have to |
|
191 | 186 | query, for example, 300 elasticsearch indices for 20 years of historical data |
|
192 | 187 | """ |
|
193 | 188 | |
|
194 | 189 | # should be cached later |
|
195 | 190 | def get_possible_names(): |
|
196 | 191 | return list(Datastores.es.aliases().keys()) |
|
197 | 192 | |
|
198 | 193 | possible_names = get_possible_names() |
|
199 | 194 | es_index_types = [] |
|
200 | 195 | if not ixtypes: |
|
201 | 196 | ixtypes = ['reports', 'metrics', 'logs'] |
|
202 | 197 | for t in ixtypes: |
|
203 | 198 | if t == 'reports': |
|
204 | 199 | es_index_types.append('rcae_r_%s') |
|
205 | 200 | elif t == 'logs': |
|
206 | 201 | es_index_types.append('rcae_l_%s') |
|
207 | 202 | elif t == 'metrics': |
|
208 | 203 | es_index_types.append('rcae_m_%s') |
|
209 | 204 | elif t == 'uptime': |
|
210 | 205 | es_index_types.append('rcae_u_%s') |
|
211 | 206 | elif t == 'slow_calls': |
|
212 | 207 | es_index_types.append('rcae_sc_%s') |
|
213 | 208 | |
|
214 | 209 | if start_date: |
|
215 | 210 | start_date = copy.copy(start_date) |
|
216 | 211 | else: |
|
217 | 212 | if not end_date: |
|
218 | 213 | end_date = datetime.utcnow() |
|
219 | 214 | start_date = end_date + relativedelta(months=months_in_past * -1) |
|
220 | 215 | |
|
221 | 216 | if not end_date: |
|
222 | 217 | end_date = start_date + relativedelta(months=months_in_past) |
|
223 | 218 | |
|
224 | 219 | index_dates = list(rrule(MONTHLY, |
|
225 | 220 | dtstart=start_date.date().replace(day=1), |
|
226 | 221 | until=end_date.date(), |
|
227 | 222 | count=36)) |
|
228 | 223 | index_names = [] |
|
229 | 224 | for ix_type in es_index_types: |
|
230 | 225 | to_extend = [ix_type % d.strftime('%Y_%m') for d in index_dates |
|
231 | 226 | if ix_type % d.strftime('%Y_%m') in possible_names] |
|
232 | 227 | index_names.extend(to_extend) |
|
233 | 228 | for day in list(rrule(DAILY, dtstart=start_date.date(), |
|
234 | 229 | until=end_date.date(), count=366)): |
|
235 | 230 | ix_name = ix_type % day.strftime('%Y_%m_%d') |
|
236 | 231 | if ix_name in possible_names: |
|
237 | 232 | index_names.append(ix_name) |
|
238 | 233 | return index_names |
|
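The index-name construction in `es_index_name_limiter` is a strftime substitution filtered against the alias names that actually exist. A sketch with a hypothetical three-month window and alias set:

```python
from datetime import date

# hypothetical window: Jan-Mar 2017, 'reports' index type
months = [date(2017, m, 1) for m in (1, 2, 3)]
candidates = ['rcae_r_%s' % d.strftime('%Y_%m') for d in months]
print(candidates)  # ['rcae_r_2017_01', 'rcae_r_2017_02', 'rcae_r_2017_03']

# only names that actually exist as aliases are kept
possible_names = {'rcae_r_2017_01', 'rcae_r_2017_03'}
kept = [n for n in candidates if n in possible_names]
print(kept)  # ['rcae_r_2017_01', 'rcae_r_2017_03']
```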
239 | 234 | |
|
240 | 235 | |
|
241 | 236 | def build_filter_settings_from_query_dict( |
|
242 | 237 | request, params=None, override_app_ids=None, |
|
243 | 238 | resource_permissions=None): |
|
244 | 239 | """ |
|
245 | 240 | Builds list of normalized search terms for ES from query params |
|
246 | 241 | ensuring application list is restricted to only applications user |
|
247 | 242 | has access to |
|
248 | 243 | |
|
249 | 244 | :param params (dictionary) |
|
250 | 245 | :param override_app_ids - list of application ids to use instead of |
|
251 | 246 | applications user normally has access to |
|
252 | 247 | """ |
|
253 | 248 | params = copy.deepcopy(params) |
|
254 | 249 | applications = [] |
|
255 | 250 | if not resource_permissions: |
|
256 | 251 | resource_permissions = ['view'] |
|
257 | 252 | |
|
258 | 253 | if request.user: |
|
259 | 254 | applications = request.user.resources_with_perms( |
|
260 | 255 | resource_permissions, resource_types=['application']) |
|
261 | 256 | |
|
262 | 257 | # CRITICAL - this ensures our resultset is limited to only the ones |
|
263 | 258 | # user has view permissions |
|
264 | 259 | all_possible_app_ids = set([app.resource_id for app in applications]) |
|
265 | 260 | |
|
266 | 261 | # if override is preset we force permission for app to be present |
|
267 | 262 | # this allows users to see dashboards and applications they would |
|
268 | 263 | # normally not be able to see |
|
269 | 264 | |
|
270 | 265 | if override_app_ids: |
|
271 | 266 | all_possible_app_ids = set(override_app_ids) |
|
272 | 267 | |
|
273 | 268 | schema = LogSearchSchema().bind(resources=all_possible_app_ids) |
|
274 | 269 | tag_schema = TagListSchema() |
|
275 | 270 | filter_settings = schema.deserialize(params) |
|
276 | 271 | tag_list = [] |
|
277 | 272 | for k, v in list(filter_settings.items()): |
|
278 | 273 | if k in accepted_search_params: |
|
279 | 274 | continue |
|
280 | 275 | tag_list.append({"name": k, "value": v, "op": 'eq'}) |
|
281 | 276 | # remove the key from filter_settings |
|
282 | 277 | filter_settings.pop(k, None) |
|
283 | 278 | tags = tag_schema.deserialize(tag_list) |
|
284 | 279 | filter_settings['tags'] = tags |
|
285 | 280 | return filter_settings |
|
286 | 281 | |
|
287 | 282 | |
|
288 | 283 | def gen_uuid(): |
|
289 | 284 | return str(uuid.uuid4()) |
|
290 | 285 | |
|
291 | 286 | |
|
292 | 287 | def gen_uuid4_sha_hex(): |
|
293 | 288 | return hashlib.sha1(uuid.uuid4().bytes).hexdigest() |
|
294 | 289 | |
|
295 | 290 | |
|
296 | 291 | def permission_tuple_to_dict(data): |
|
297 | 292 | out = { |
|
298 | 293 | "user_name": None, |
|
299 | 294 | "perm_name": data.perm_name, |
|
300 | 295 | "owner": data.owner, |
|
301 | 296 | "type": data.type, |
|
302 | 297 | "resource_name": None, |
|
303 | 298 | "resource_type": None, |
|
304 | 299 | "resource_id": None, |
|
305 | 300 | "group_name": None, |
|
306 | 301 | "group_id": None |
|
307 | 302 | } |
|
308 | 303 | if data.user: |
|
309 | 304 | out["user_name"] = data.user.user_name |
|
310 | 305 | if data.perm_name == ALL_PERMISSIONS: |
|
311 | 306 | out['perm_name'] = '__all_permissions__' |
|
312 | 307 | if data.resource: |
|
313 | 308 | out['resource_name'] = data.resource.resource_name |
|
314 | 309 | out['resource_type'] = data.resource.resource_type |
|
315 | 310 | out['resource_id'] = data.resource.resource_id |
|
316 | 311 | if data.group: |
|
317 | 312 | out['group_name'] = data.group.group_name |
|
318 | 313 | out['group_id'] = data.group.id |
|
319 | 314 | return out |
|
320 | 315 | |
|
321 | 316 | |
|
322 | 317 | def get_cached_buckets(request, stats_since, end_time, fn, cache_key, |
|
323 | 318 | gap_gen=None, db_session=None, step_interval=None, |
|
324 | 319 | iv_extractor=None, |
|
325 | 320 | rerange=False, *args, **kwargs): |
|
326 | 321 | """ Takes "fn" that should return some data and tries to load the data |
|
327 | 322 | dividing it into daily buckets - if the stats_since and end time give a |
|
328 | 323 | delta bigger than 24 hours, then only "today's" data is computed on the fly |
|
329 | 324 | |
|
330 | 325 | :param request: (request) request object |
|
331 | 326 | :param stats_since: (datetime) start date of buckets range |
|
332 | 327 | :param end_time: (datetime) end date of buckets range - utcnow() if None |
|
333 | 328 | :param fn: (callable) callable to use to populate buckets should have |
|
334 | 329 | following signature: |
|
335 | 330 | def get_data(request, since_when, until, *args, **kwargs): |
|
336 | 331 | |
|
337 | 332 | :param cache_key: (string) cache key that will be used to build bucket |
|
338 | 333 | caches |
|
339 | 334 | :param gap_gen: (callable) gap generator - should return step intervals |
|
340 | 335 | to use with our `fn` callable |
|
341 | 336 | :param db_session: (Session) sqlalchemy session |
|
342 | 337 | :param step_interval: (timedelta) optional step interval if we want to |
|
343 | 338 | override the default determined from total start/end time delta |
|
344 | 339 | :param iv_extractor: (callable) used to get step intervals from data |
|
345 | 340 | returned by `fn` callable |
|
346 | 341 | :param rerange: (bool) handy if we want to change ranges from hours to |
|
347 | 342 | days when cached data is missing - will shorten execution time if `fn` |
|
348 | 343 | callable supports that and we are working with multiple rows - like metrics |
|
349 | 344 | :param args: |
|
350 | 345 | :param kwargs: |
|
351 | 346 | |
|
352 | 347 | :return: iterable |
|
353 | 348 | """ |
|
354 | 349 | if not end_time: |
|
355 | 350 | end_time = datetime.utcnow().replace(second=0, microsecond=0) |
|
356 | 351 | delta = end_time - stats_since |
|
357 | 352 | # if smaller than 3 days we want to group by 5min else by 1h, |
|
358 | 353 | # for 60 min group by min |
|
359 | 354 | if not gap_gen: |
|
360 | 355 | gap_gen = gap_gen_default |
|
361 | 356 | if not iv_extractor: |
|
362 | 357 | iv_extractor = default_extractor |
|
363 | 358 | |
|
364 | 359 | # do not use custom interval if total time range with new iv would exceed |
|
365 | 360 | # end time |
|
366 | 361 | if not step_interval or stats_since + step_interval >= end_time: |
|
367 | 362 | if delta < h.time_deltas.get('12h')['delta']: |
|
368 | 363 | step_interval = timedelta(seconds=60) |
|
369 | 364 | elif delta < h.time_deltas.get('3d')['delta']: |
|
370 | 365 | step_interval = timedelta(seconds=60 * 5) |
|
371 | 366 | elif delta > h.time_deltas.get('2w')['delta']: |
|
372 | 367 | step_interval = timedelta(days=1) |
|
373 | 368 | else: |
|
374 | 369 | step_interval = timedelta(minutes=60) |
|
375 | 370 | |
|
376 | 371 | if step_interval >= timedelta(minutes=60): |
|
377 | 372 | log.info('cached_buckets:{}: adjusting start time ' |
|
378 | 373 | 'for hourly or daily intervals'.format(cache_key)) |
|
379 | 374 | stats_since = stats_since.replace(hour=0, minute=0) |
|
380 | 375 | |
|
381 | 376 | ranges = [i.start_interval for i in list(gap_gen(stats_since, |
|
382 | 377 | step_interval, [], |
|
383 | 378 | end_time=end_time))] |
|
384 | 379 | buckets = {} |
|
385 | 380 | storage_key = 'buckets:' + cache_key + '{}|{}' |
|
386 | 381 | # this means we basically cache per hour in 3-14 day intervals but i think |
|
387 | 382 | # it's fine at this point - will be faster than db access anyway |
|
388 | 383 | |
|
389 | 384 | if len(ranges) >= 1: |
|
390 | 385 | last_ranges = [ranges[-1]] |
|
391 | 386 | else: |
|
392 | 387 | last_ranges = [] |
|
393 | 388 | if step_interval >= timedelta(minutes=60): |
|
394 | 389 | for r in ranges: |
|
395 | 390 | k = storage_key.format(step_interval.total_seconds(), r) |
|
396 | 391 | value = request.registry.cache_regions.redis_day_30.get(k) |
|
397 | 392 | # last buckets are never loaded from cache |
|
398 | 393 | is_last_result = ( |
|
399 | 394 | r >= end_time - timedelta(hours=6) or r in last_ranges) |
|
400 | 395 | if value is not NO_VALUE and not is_last_result: |
|
401 | 396 | log.info("cached_buckets:{}: " |
|
402 | 397 | "loading range {} from cache".format(cache_key, r)) |
|
403 | 398 | buckets[r] = value |
|
404 | 399 | else: |
|
405 | 400 | log.info("cached_buckets:{}: " |
|
406 | 401 | "loading range {} from storage".format(cache_key, r)) |
|
407 | 402 | range_size = step_interval |
|
408 | 403 | if (step_interval == timedelta(minutes=60) and |
|
409 | 404 | not is_last_result and rerange): |
|
410 | 405 | range_size = timedelta(days=1) |
|
411 | 406 | r = r.replace(hour=0, minute=0) |
|
412 | 407 | log.info("cached_buckets:{}: " |
|
413 | 408 | "loading collapsed " |
|
414 | 409 | "range {} {}".format(cache_key, r, |
|
415 | 410 | r + range_size)) |
|
416 | 411 | bucket_data = fn( |
|
417 | 412 | request, r, r + range_size, step_interval, |
|
418 | 413 | gap_gen, bucket_count=len(ranges), *args, **kwargs) |
|
419 | 414 | for b in bucket_data: |
|
420 | 415 | b_iv = iv_extractor(b) |
|
421 | 416 | buckets[b_iv] = b |
|
422 | 417 | k2 = storage_key.format( |
|
423 | 418 | step_interval.total_seconds(), b_iv) |
|
424 | 419 | request.registry.cache_regions.redis_day_30.set(k2, b) |
|
425 | 420 | log.info("cached_buckets:{}: saving cache".format(cache_key)) |
|
426 | 421 | else: |
|
427 | 422 | # bucket count is 1 for short time ranges <= 24h from now |
|
428 | 423 | bucket_data = fn(request, stats_since, end_time, step_interval, |
|
429 | 424 | gap_gen, bucket_count=1, *args, **kwargs) |
|
430 | 425 | for b in bucket_data: |
|
431 | 426 | buckets[iv_extractor(b)] = b |
|
432 | 427 | return buckets |
|
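The step-interval selection at the top of `get_cached_buckets` can be isolated as a small function. The `h.time_deltas` lookups are replaced here with literal values matching their keys ('12h', '3d', '2w'), which is an assumption made for illustration:

```python
from datetime import timedelta

def pick_step_interval(delta):
    # mirrors the bucketing thresholds above; h.time_deltas lookups are
    # replaced by literal timedeltas (assumption for illustration)
    if delta < timedelta(hours=12):
        return timedelta(seconds=60)
    elif delta < timedelta(days=3):
        return timedelta(seconds=60 * 5)
    elif delta > timedelta(weeks=2):
        return timedelta(days=1)
    return timedelta(minutes=60)

print(pick_step_interval(timedelta(hours=2)))  # minute buckets
print(pick_step_interval(timedelta(days=30)))  # daily buckets
```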
433 | 428 | |
|
434 | 429 | |
|
435 | 430 | def get_cached_split_data(request, stats_since, end_time, fn, cache_key, |
|
436 | 431 | db_session=None, *args, **kwargs): |
|
437 | 432 | """ Takes "fn" that should return some data and tries to load the data |
|
438 | 433 | dividing it into 2 buckets - cached "since_from" bucket and "today" |
|
439 | 434 | bucket - then the data can be reduced into single value |
|
440 | 435 | |
|
441 | 436 | Data is cached if the stats_since and end time give a delta bigger |
|
442 | 437 | than 24 hours - then only 24h is computed on the fly |
|
443 | 438 | """ |
|
444 | 439 | if not end_time: |
|
445 | 440 | end_time = datetime.utcnow().replace(second=0, microsecond=0) |
|
446 | 441 | delta = end_time - stats_since |
|
447 | 442 | |
|
448 | 443 | if delta >= timedelta(minutes=60): |
|
449 | 444 | log.info('cached_split_data:{}: adjusting start time ' |
|
450 | 445 | 'for hourly or daily intervals'.format(cache_key)) |
|
451 | 446 | stats_since = stats_since.replace(hour=0, minute=0) |
|
452 | 447 | |
|
453 | 448 | storage_key = 'buckets_split_data:' + cache_key + ':{}|{}' |
|
454 | 449 | old_end_time = end_time.replace(hour=0, minute=0) |
|
455 | 450 | |
|
456 | 451 | final_storage_key = storage_key.format(delta.total_seconds(), |
|
457 | 452 | old_end_time) |
|
458 | 453 | older_data = None |
|
459 | 454 | |
|
460 | 455 | cdata = request.registry.cache_regions.redis_day_7.get( |
|
461 | 456 | final_storage_key) |
|
462 | 457 | |
|
463 | 458 | if cdata: |
|
464 | 459 | log.info("cached_split_data:{}: found old " |
|
465 | 460 | "bucket data".format(cache_key)) |
|
466 | 461 | older_data = cdata |
|
467 | 462 | |
|
468 | 463 | if (stats_since < end_time - h.time_deltas.get('24h')['delta'] and |
|
469 | 464 | not cdata): |
|
470 | 465 | log.info("cached_split_data:{}: didn't find the " |
|
471 | 466 | "start bucket in cache so load older data".format(cache_key)) |
|
472 | 467 | recent_stats_since = old_end_time |
|
473 | 468 | older_data = fn(request, stats_since, recent_stats_since, |
|
474 | 469 | db_session=db_session, *args, **kwargs) |
|
475 | 470 | request.registry.cache_regions.redis_day_7.set(final_storage_key, |
|
476 | 471 | older_data) |
|
477 | 472 | elif stats_since < end_time - h.time_deltas.get('24h')['delta']: |
|
478 | 473 | recent_stats_since = old_end_time |
|
479 | 474 | else: |
|
480 | 475 | recent_stats_since = stats_since |
|
481 | 476 | |
|
482 | 477 | log.info("cached_split_data:{}: loading fresh " |
|
483 | 478 | "data buckets from last 24h ".format(cache_key)) |
|
484 | 479 | todays_data = fn(request, recent_stats_since, end_time, |
|
485 | 480 | db_session=db_session, *args, **kwargs) |
|
486 | 481 | return older_data, todays_data |
|
487 | 482 | |
|
488 | 483 | |
|
489 | 484 | def in_batches(seq, size): |
|
490 | 485 | """ |
|
491 | 486 | Splits an iterable into batches of the specified size |
|
492 | 487 | :param seq (iterable) |
|
493 | 488 | :param size integer |
|
494 | 489 | """ |
|
495 | 490 | return (seq[pos:pos + size] for pos in range(0, len(seq), size)) |
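`in_batches` is a one-line generator expression over slices (so `seq` must support `len()` and slicing, i.e. be a sequence rather than an arbitrary iterable); reproduced here with a usage example:

```python
def in_batches(seq, size):
    """Splits an indexable sequence into batches of the given size."""
    return (seq[pos:pos + size] for pos in range(0, len(seq), size))

print(list(in_batches([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5]]
```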
@@ -1,147 +1,142 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | import uuid |
|
24 | 19 | |
|
25 | 20 | from datetime import datetime |
|
26 | 21 | |
|
27 | 22 | log = logging.getLogger(__name__) |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | def parse_airbrake_xml(request): |
|
31 | 26 | root = request.context.airbrake_xml_etree |
|
32 | 27 | error = root.find('error') |
|
33 | 28 | notifier = root.find('notifier') |
|
34 | 29 | server_env = root.find('server-environment') |
|
35 | 30 | request_data = root.find('request') |
|
36 | 31 | user = root.find('current-user') |
|
37 | 32 | if request_data is not None: |
|
38 | 33 | cgi_data = request_data.find('cgi-data') |
|
39 | 34 | if cgi_data is None: |
|
40 | 35 | cgi_data = [] |
|
41 | 36 | |
|
42 | 37 | error_dict = { |
|
43 | 38 | 'class_name': error.findtext('class') or '', |
|
44 | 39 | 'error': error.findtext('message') or '', |
|
45 | 40 | "occurences": 1, |
|
46 | 41 | "http_status": 500, |
|
47 | 42 | "priority": 5, |
|
48 | 43 | "server": 'unknown', |
|
49 | 44 | 'url': 'unknown', 'request': {} |
|
50 | 45 | } |
|
51 | 46 | if user is not None: |
|
52 | 47 | error_dict['username'] = user.findtext('username') or \ |
|
53 | 48 | user.findtext('id') |
|
54 | 49 | if notifier is not None: |
|
55 | 50 | error_dict['client'] = notifier.findtext('name') |
|
56 | 51 | |
|
57 | 52 | if server_env is not None: |
|
58 | 53 | error_dict["server"] = server_env.findtext('hostname', 'unknown') |
|
59 | 54 | |
|
60 | 55 | whitelist_environ = ['REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', |
|
61 | 56 | 'CONTENT_TYPE', 'HTTP_REFERER'] |
|
62 | 57 | |
|
63 | 58 | if request_data is not None: |
|
64 | 59 | error_dict['url'] = request_data.findtext('url', 'unknown') |
|
65 | 60 | component = request_data.findtext('component') |
|
66 | 61 | action = request_data.findtext('action') |
|
67 | 62 | if component and action: |
|
68 | 63 | error_dict['view_name'] = '%s:%s' % (component, action) |
|
69 | 64 | for node in cgi_data: |
|
70 | 65 | key = node.get('key') |
|
71 | 66 | if key.startswith('HTTP') or key in whitelist_environ: |
|
72 | 67 | error_dict['request'][key] = node.text |
|
73 | 68 | elif 'query_parameters' in key: |
|
74 | 69 | error_dict['request']['GET'] = {} |
|
75 | 70 | for x in node: |
|
76 | 71 | error_dict['request']['GET'][x.get('key')] = x.text |
|
77 | 72 | elif 'request_parameters' in key: |
|
78 | 73 | error_dict['request']['POST'] = {} |
|
79 | 74 | for x in node: |
|
80 | 75 | error_dict['request']['POST'][x.get('key')] = x.text |
|
81 | 76 | elif key.endswith('cookie'): |
|
82 | 77 | error_dict['request']['COOKIE'] = {} |
|
83 | 78 | for x in node: |
|
84 | 79 | error_dict['request']['COOKIE'][x.get('key')] = x.text |
|
85 | 80 | elif key.endswith('request_id'): |
|
86 | 81 | error_dict['request_id'] = node.text |
|
87 | 82 | elif key.endswith('session'): |
|
88 | 83 | error_dict['request']['SESSION'] = {} |
|
89 | 84 | for x in node: |
|
90 | 85 | error_dict['request']['SESSION'][x.get('key')] = x.text |
|
91 | 86 | else: |
|
92 | 87 | if key in ['rack.session.options']: |
|
93 | 88 | # skip secret configs |
|
94 | 89 | continue |
|
95 | 90 | try: |
|
96 | 91 | if len(node): |
|
97 | 92 | error_dict['request'][key] = dict( |
|
98 | 93 | [(x.get('key'), x.text,) for x in node]) |
|
99 | 94 | else: |
|
100 | 95 | error_dict['request'][key] = node.text |
|
101 | 96 | except Exception as e: |
|
102 | 97 | log.warning('Airbrake integration exception: %s' % e) |
|
103 | 98 | |
|
104 | 99 | error_dict['request'].pop('HTTP_COOKIE', '') |
|
105 | 100 | |
|
106 | 101 | error_dict['ip'] = error_dict.pop('REMOTE_ADDR', '') |
|
107 | 102 | error_dict['user_agent'] = error_dict.pop('HTTP_USER_AGENT', '') |
|
108 | 103 | if 'request_id' not in error_dict: |
|
109 | 104 | error_dict['request_id'] = str(uuid.uuid4()) |
|
110 | 105 | if request.context.possibly_public: |
|
111 | 106 | # set ip for reports that come from airbrake js client |
|
112 | 107 | error_dict["timestamp"] = datetime.utcnow() |
|
113 | 108 | if request.environ.get("HTTP_X_FORWARDED_FOR"): |
|
114 | 109 | ip = request.environ.get("HTTP_X_FORWARDED_FOR", '') |
|
115 | 110 | first_ip = ip.split(',')[0] |
|
116 | 111 | remote_addr = first_ip.strip() |
|
117 | 112 | else: |
|
118 | 113 | remote_addr = (request.environ.get("HTTP_X_REAL_IP") or |
|
119 | 114 | request.environ.get('REMOTE_ADDR')) |
|
120 | 115 | error_dict["ip"] = remote_addr |
|
121 | 116 | |
|
122 | 117 | blacklist = ['password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf', |
|
123 | 118 | 'session', 'test'] |
|
124 | 119 | |
|
125 | 120 | lines = [] |
|
126 | 121 | for l in error.find('backtrace'): |
|
127 | 122 | lines.append({'file': l.get("file", ""), |
|
128 | 123 | 'line': l.get("number", ""), |
|
129 | 124 | 'fn': l.get("method", ""), |
|
130 | 125 | 'module': l.get("module", ""), |
|
131 | 126 | 'cline': l.get("method", ""), |
|
132 | 127 | 'vars': {}}) |
|
133 | 128 | error_dict['traceback'] = list(reversed(lines)) |
|
134 | 129 | # filtering is not provided by airbrake |
|
135 | 130 | keys_to_check = ( |
|
136 | 131 | error_dict['request'].get('COOKIE'), |
|
137 | 132 | error_dict['request'].get('COOKIES'), |
|
138 | 133 | error_dict['request'].get('POST'), |
|
139 | 134 | error_dict['request'].get('SESSION'), |
|
140 | 135 | ) |
|
141 | 136 | for source in [_f for _f in keys_to_check if _f]: |
|
142 | 137 | for k in source.keys(): |
|
143 | 138 | for bad_key in blacklist: |
|
144 | 139 | if bad_key in k.lower(): |
|
145 | 140 | source[k] = '***' |
|
146 | 141 | |
|
147 | 142 | return error_dict |
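The secret-masking step at the end of `parse_airbrake_xml` can be illustrated in isolation. This is a standalone sketch of that loop (the `blacklist` values are copied from the code above; `mask_secrets` is a hypothetical name for the inlined logic):

```python
# Any key containing a blacklisted substring is replaced with '***'
# so credentials never reach report storage.
blacklist = ['password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf',
             'session', 'test']

def mask_secrets(source):
    for k in list(source.keys()):
        for bad_key in blacklist:
            if bad_key in k.lower():
                source[k] = '***'
    return source

post = mask_secrets({'user_password': 'hunter2', 'name': 'bob'})
# post == {'user_password': '***', 'name': 'bob'}
```

The match is a case-insensitive substring check, so e.g. `User_Password` and `csrf_token` are both masked.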
@@ -1,61 +1,56 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from datetime import tzinfo, timedelta, datetime |
|
23 | 18 | from dateutil.relativedelta import relativedelta |
|
24 | 19 | import logging |
|
25 | 20 | |
|
26 | 21 | log = logging.getLogger(__name__) |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | def to_relativedelta(time_delta): |
|
30 | 25 | return relativedelta(seconds=int(time_delta.total_seconds()), |
|
31 | 26 | microseconds=time_delta.microseconds) |
|
32 | 27 | |
|
33 | 28 | |
|
34 | 29 | def convert_date(date_str, return_utcnow_if_wrong=True, |
|
35 | 30 | normalize_future=False): |
|
36 | 31 | utcnow = datetime.utcnow() |
|
37 | 32 | if isinstance(date_str, datetime): |
|
38 | 33 | # get rid of tzinfo |
|
39 | 34 | return date_str.replace(tzinfo=None) |
|
40 | 35 | if not date_str and return_utcnow_if_wrong: |
|
41 | 36 | return utcnow |
|
42 | 37 | try: |
|
43 | 38 | try: |
|
44 | 39 | if 'Z' in date_str: |
|
45 | 40 | date_str = date_str[:date_str.index('Z')] |
|
46 | 41 | if '.' in date_str: |
|
47 | 42 | date = datetime.strptime(date_str, '%Y-%m-%dT%H:%M:%S.%f') |
|
48 | 43 | else: |
|
49 | 44 | date = datetime.strptime(date_str, '%Y-%m-%dT%H:%M:%S') |
|
50 | 45 | except Exception: |
|
51 | 46 | # bw compat with old client |
|
52 | 47 | date = datetime.strptime(date_str, '%Y-%m-%d %H:%M:%S,%f') |
|
53 | 48 | except Exception: |
|
54 | 49 | if return_utcnow_if_wrong: |
|
55 | 50 | date = utcnow |
|
56 | 51 | else: |
|
57 | 52 | date = None |
|
58 | 53 | if normalize_future and date and date > (utcnow + timedelta(minutes=3)): |
|
59 | 54 | log.warning('time %s in future + 3 min, normalizing' % date) |
|
60 | 55 | return utcnow |
|
61 | 56 | return date |
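The fallback parsing in `convert_date` can be summarized as: strip a trailing `Z`, then try the ISO-like formats before the legacy client format. A minimal sketch of just that strategy (`parse_client_date` is a hypothetical name, not part of the module):

```python
from datetime import datetime

def parse_client_date(date_str):
    # Drop everything from a literal 'Z' onward, as convert_date does.
    if 'Z' in date_str:
        date_str = date_str[:date_str.index('Z')]
    for fmt in ('%Y-%m-%dT%H:%M:%S.%f',   # ISO with microseconds
                '%Y-%m-%dT%H:%M:%S',      # ISO without microseconds
                '%Y-%m-%d %H:%M:%S,%f'):  # legacy client format
        try:
            return datetime.strptime(date_str, fmt)
        except ValueError:
            continue
    return None

d = parse_client_date('2016-01-02T03:04:05.000123Z')
# d == datetime(2016, 1, 2, 3, 4, 5, 123)
```

Unlike `convert_date`, this sketch omits the future-timestamp normalization and the `return_utcnow_if_wrong` fallback.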
@@ -1,301 +1,296 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from datetime import timedelta |
|
23 | 18 | |
|
24 | 19 | from appenlight.lib.enums import LogLevelPython, ParsedSentryEventType |
|
25 | 20 | |
|
26 | 21 | EXCLUDED_LOG_VARS = [ |
|
27 | 22 | 'args', 'asctime', 'created', 'exc_info', 'exc_text', 'filename', |
|
28 | 23 | 'funcName', 'levelname', 'levelno', 'lineno', 'message', 'module', 'msecs', |
|
29 | 24 | 'msg', 'name', 'pathname', 'process', 'processName', 'relativeCreated', |
|
30 | 25 | 'thread', 'threadName'] |
|
31 | 26 | |
|
32 | 27 | EXCLUDE_SENTRY_KEYS = [ |
|
33 | 28 | 'csp', |
|
34 | 29 | 'culprit', |
|
35 | 30 | 'event_id', |
|
36 | 31 | 'exception', |
|
37 | 32 | 'extra', |
|
38 | 33 | 'level', |
|
39 | 34 | 'logentry', |
|
40 | 35 | 'logger', |
|
41 | 36 | 'message', |
|
42 | 37 | 'modules', |
|
43 | 38 | 'platform', |
|
44 | 39 | 'query', |
|
45 | 40 | 'release', |
|
46 | 41 | 'request', |
|
47 | 42 | 'sentry.interfaces.Csp', 'sentry.interfaces.Exception', |
|
48 | 43 | 'sentry.interfaces.Http', 'sentry.interfaces.Message', |
|
49 | 44 | 'sentry.interfaces.Query', |
|
50 | 45 | 'sentry.interfaces.Stacktrace', |
|
51 | 46 | 'sentry.interfaces.Template', 'sentry.interfaces.User', |
|
52 | 47 | 'sentry.interfaces.csp.Csp', |
|
53 | 48 | 'sentry.interfaces.exception.Exception', |
|
54 | 49 | 'sentry.interfaces.http.Http', |
|
55 | 50 | 'sentry.interfaces.message.Message', |
|
56 | 51 | 'sentry.interfaces.query.Query', |
|
57 | 52 | 'sentry.interfaces.stacktrace.Stacktrace', |
|
58 | 53 | 'sentry.interfaces.template.Template', |
|
59 | 54 | 'sentry.interfaces.user.User', 'server_name', |
|
60 | 55 | 'stacktrace', |
|
61 | 56 | 'tags', |
|
62 | 57 | 'template', |
|
63 | 58 | 'time_spent', |
|
64 | 59 | 'timestamp', |
|
65 | 60 | 'user'] |
|
66 | 61 | |
|
67 | 62 | |
|
68 | 63 | def get_keys(list_of_keys, json_body): |
|
69 | 64 | for k in list_of_keys: |
|
70 | 65 | if k in json_body: |
|
71 | 66 | return json_body[k] |
|
72 | 67 | |
|
73 | 68 | |
|
74 | 69 | def get_logentry(json_body): |
|
75 | 70 | key_names = ['logentry', |
|
76 | 71 | 'sentry.interfaces.message.Message', |
|
77 | 72 | 'sentry.interfaces.Message' |
|
78 | 73 | ] |
|
79 | 74 | logentry = get_keys(key_names, json_body) |
|
80 | 75 | return logentry |
|
81 | 76 | |
|
82 | 77 | |
|
83 | 78 | def get_exception(json_body): |
|
84 | 79 | parsed_exception = {} |
|
85 | 80 | key_names = ['exception', |
|
86 | 81 | 'sentry.interfaces.exception.Exception', |
|
87 | 82 | 'sentry.interfaces.Exception' |
|
88 | 83 | ] |
|
89 | 84 | exception = get_keys(key_names, json_body) or {} |
|
90 | 85 | if exception: |
|
91 | 86 | if isinstance(exception, dict): |
|
92 | 87 | exception = exception['values'][0] |
|
93 | 88 | else: |
|
94 | 89 | exception = exception[0] |
|
95 | 90 | |
|
96 | 91 | parsed_exception['type'] = exception.get('type') |
|
97 | 92 | parsed_exception['value'] = exception.get('value') |
|
98 | 93 | parsed_exception['module'] = exception.get('module') |
|
99 | 94 | parsed_stacktrace = get_stacktrace(exception) or {} |
|
100 | 95 | parsed_exception = exception or {} |
|
101 | 96 | return parsed_exception, parsed_stacktrace |
|
102 | 97 | |
|
103 | 98 | |
|
104 | 99 | def get_stacktrace(json_body): |
|
105 | 100 | parsed_stacktrace = [] |
|
106 | 101 | key_names = ['stacktrace', |
|
107 | 102 | 'sentry.interfaces.stacktrace.Stacktrace', |
|
108 | 103 | 'sentry.interfaces.Stacktrace' |
|
109 | 104 | ] |
|
110 | 105 | stacktrace = get_keys(key_names, json_body) |
|
111 | 106 | if stacktrace: |
|
112 | 107 | for frame in stacktrace['frames']: |
|
113 | 108 | parsed_stacktrace.append( |
|
114 | 109 | {"cline": frame.get('context_line', ''), |
|
115 | 110 | "file": frame.get('filename', ''), |
|
116 | 111 | "module": frame.get('module', ''), |
|
117 | 112 | "fn": frame.get('function', ''), |
|
118 | 113 | "line": frame.get('lineno', ''), |
|
119 | 114 | "vars": list(frame.get('vars', {}).items()) |
|
120 | 115 | } |
|
121 | 116 | ) |
|
122 | 117 | return parsed_stacktrace |
|
123 | 118 | |
|
124 | 119 | |
|
125 | 120 | def get_template(json_body): |
|
126 | 121 | parsed_template = {} |
|
127 | 122 | key_names = ['template', |
|
128 | 123 | 'sentry.interfaces.template.Template', |
|
129 | 124 | 'sentry.interfaces.Template' |
|
130 | 125 | ] |
|
131 | 126 | template = get_keys(key_names, json_body) |
|
132 | 127 | if template: |
|
133 | 128 | for frame in template['frames']: |
|
134 | 129 | parsed_template.append( |
|
135 | 130 | {"cline": frame.get('context_line', ''), |
|
136 | 131 | "file": frame.get('filename', ''), |
|
137 | 132 | "fn": '', |
|
138 | 133 | "line": frame.get('lineno', ''), |
|
139 | 134 | "vars": [] |
|
140 | 135 | } |
|
141 | 136 | ) |
|
142 | 137 | |
|
143 | 138 | return parsed_template |
|
144 | 139 | |
|
145 | 140 | |
|
146 | 141 | def get_request(json_body): |
|
147 | 142 | parsed_http = {} |
|
148 | 143 | key_names = ['request', |
|
149 | 144 | 'sentry.interfaces.http.Http', |
|
150 | 145 | 'sentry.interfaces.Http' |
|
151 | 146 | ] |
|
152 | 147 | http = get_keys(key_names, json_body) or {} |
|
153 | 148 | for k, v in http.items(): |
|
154 | 149 | if k == 'headers': |
|
155 | 150 | parsed_http['headers'] = {} |
|
156 | 151 | for sk, sv in http['headers'].items(): |
|
157 | 152 | parsed_http['headers'][sk.title()] = sv |
|
158 | 153 | else: |
|
159 | 154 | parsed_http[k.lower()] = v |
|
160 | 155 | return parsed_http |
|
161 | 156 | |
|
162 | 157 | |
|
163 | 158 | def get_user(json_body): |
|
164 | 159 | parsed_user = {} |
|
165 | 160 | key_names = ['user', |
|
166 | 161 | 'sentry.interfaces.user.User', |
|
167 | 162 | 'sentry.interfaces.User' |
|
168 | 163 | ] |
|
169 | 164 | user = get_keys(key_names, json_body) |
|
170 | 165 | if user: |
|
171 | 166 | parsed_user['id'] = user.get('id') |
|
172 | 167 | parsed_user['username'] = user.get('username') |
|
173 | 168 | parsed_user['email'] = user.get('email') |
|
174 | 169 | parsed_user['ip_address'] = user.get('ip_address') |
|
175 | 170 | |
|
176 | 171 | return parsed_user |
|
177 | 172 | |
|
178 | 173 | |
|
179 | 174 | def get_query(json_body): |
|
180 | 175 | query = None |
|
181 | 176 | key_name = ['query', |
|
182 | 177 | 'sentry.interfaces.query.Query', |
|
183 | 178 | 'sentry.interfaces.Query' |
|
184 | 179 | ] |
|
185 | 180 | query = get_keys(key_name, json_body) |
|
186 | 181 | return query |
|
187 | 182 | |
|
188 | 183 | |
|
189 | 184 | def parse_sentry_event(json_body): |
|
190 | 185 | request_id = json_body.get('event_id') |
|
191 | 186 | |
|
192 | 187 | # required |
|
193 | 188 | message = json_body.get('message') |
|
194 | 189 | log_timestamp = json_body.get('timestamp') |
|
195 | 190 | level = json_body.get('level') |
|
196 | 191 | if isinstance(level, int): |
|
197 | 192 | level = LogLevelPython.key_from_value(level) |
|
198 | 193 | |
|
199 | 194 | namespace = json_body.get('logger') |
|
200 | 195 | language = json_body.get('platform') |
|
201 | 196 | |
|
202 | 197 | # optional |
|
203 | 198 | server_name = json_body.get('server_name') |
|
204 | 199 | culprit = json_body.get('culprit') |
|
205 | 200 | release = json_body.get('release') |
|
206 | 201 | |
|
207 | 202 | tags = json_body.get('tags', {}) |
|
208 | 203 | if hasattr(tags, 'items'): |
|
209 | 204 | tags = list(tags.items()) |
|
210 | 205 | extra = json_body.get('extra', {}) |
|
211 | 206 | if hasattr(extra, 'items'): |
|
212 | 207 | extra = list(extra.items()) |
|
213 | 208 | |
|
214 | 209 | parsed_req = get_request(json_body) |
|
215 | 210 | user = get_user(json_body) |
|
216 | 211 | template = get_template(json_body) |
|
217 | 212 | query = get_query(json_body) |
|
218 | 213 | |
|
219 | 214 | # other unidentified keys found |
|
220 | 215 | other_keys = [(k, json_body[k]) for k in json_body.keys() |
|
221 | 216 | if k not in EXCLUDE_SENTRY_KEYS] |
|
222 | 217 | |
|
223 | 218 | logentry = get_logentry(json_body) |
|
224 | 219 | if logentry: |
|
225 | 220 | message = logentry['message'] |
|
226 | 221 | |
|
227 | 222 | exception, stacktrace = get_exception(json_body) |
|
228 | 223 | |
|
229 | 224 | alt_stacktrace = get_stacktrace(json_body) |
|
230 | 225 | event_type = None |
|
231 | 226 | if not exception and not stacktrace and not alt_stacktrace and not template: |
|
232 | 227 | event_type = ParsedSentryEventType.LOG |
|
233 | 228 | |
|
234 | 229 | event_dict = { |
|
235 | 230 | 'log_level': level, |
|
236 | 231 | 'message': message, |
|
237 | 232 | 'namespace': namespace, |
|
238 | 233 | 'request_id': request_id, |
|
239 | 234 | 'server': server_name, |
|
240 | 235 | 'date': log_timestamp, |
|
241 | 236 | 'tags': tags |
|
242 | 237 | } |
|
243 | 238 | event_dict['tags'].extend( |
|
244 | 239 | [(k, v) for k, v in extra if k not in EXCLUDED_LOG_VARS]) |
|
245 | 240 | |
|
246 | 241 | # other keys can be various object types |
|
247 | 242 | event_dict['tags'].extend([(k, v) for k, v in other_keys |
|
248 | 243 | if isinstance(v, str)]) |
|
249 | 244 | if culprit: |
|
250 | 245 | event_dict['tags'].append(('sentry_culprit', culprit)) |
|
251 | 246 | if language: |
|
252 | 247 | event_dict['tags'].append(('sentry_language', language)) |
|
253 | 248 | if release: |
|
254 | 249 | event_dict['tags'].append(('sentry_release', release)) |
|
255 | 250 | |
|
256 | 251 | if exception or stacktrace or alt_stacktrace or template: |
|
257 | 252 | event_type = ParsedSentryEventType.ERROR_REPORT |
|
258 | 253 | event_dict = { |
|
259 | 254 | 'client': 'sentry', |
|
260 | 255 | 'error': message, |
|
261 | 256 | 'namespace': namespace, |
|
262 | 257 | 'request_id': request_id, |
|
263 | 258 | 'server': server_name, |
|
264 | 259 | 'start_time': log_timestamp, |
|
265 | 260 | 'end_time': None, |
|
266 | 261 | 'tags': tags, |
|
267 | 262 | 'extra': extra, |
|
268 | 263 | 'language': language, |
|
269 | 264 | 'view_name': json_body.get('culprit'), |
|
270 | 265 | 'http_status': None, |
|
271 | 266 | 'username': None, |
|
272 | 267 | 'url': parsed_req.get('url'), |
|
273 | 268 | 'ip': None, |
|
274 | 269 | 'user_agent': None, |
|
275 | 270 | 'request': None, |
|
276 | 271 | 'slow_calls': None, |
|
277 | 272 | 'request_stats': None, |
|
278 | 273 | 'traceback': None |
|
279 | 274 | } |
|
280 | 275 | |
|
281 | 276 | event_dict['extra'].extend(other_keys) |
|
282 | 277 | if release: |
|
283 | 278 | event_dict['tags'].append(('sentry_release', release)) |
|
284 | 279 | event_dict['request'] = parsed_req |
|
285 | 280 | if 'headers' in parsed_req: |
|
286 | 281 | event_dict['user_agent'] = parsed_req['headers'].get('User-Agent') |
|
287 | 282 | if 'env' in parsed_req: |
|
288 | 283 | event_dict['ip'] = parsed_req['env'].get('REMOTE_ADDR') |
|
289 | 284 | ts_ms = int(json_body.get('time_spent') or 0) |
|
290 | 285 | if ts_ms > 0: |
|
291 | 286 | event_dict['end_time'] = event_dict['start_time'] + \ |
|
292 | 287 | timedelta(milliseconds=ts_ms) |
|
293 | 288 | if stacktrace or alt_stacktrace or template: |
|
294 | 289 | event_dict['traceback'] = stacktrace or alt_stacktrace or template |
|
295 | 290 | for k in list(event_dict.keys()): |
|
296 | 291 | if event_dict[k] is None: |
|
297 | 292 | del event_dict[k] |
|
298 | 293 | if user: |
|
299 | 294 | event_dict['username'] = user['username'] or user['id'] \ |
|
300 | 295 | or user['email'] |
|
301 | 296 | return event_dict, event_type |
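The alias-lookup pattern used throughout this parser is worth calling out: Sentry payloads may carry the same interface under several historical key names, and `get_keys` returns the first one present. A self-contained sketch (function body copied from above, sample payload invented for illustration):

```python
def get_keys(list_of_keys, json_body):
    # Return the value for the first key name found in the payload.
    for k in list_of_keys:
        if k in json_body:
            return json_body[k]

body = {'sentry.interfaces.Message': {'message': 'boom'}}
logentry = get_keys(['logentry',
                     'sentry.interfaces.message.Message',
                     'sentry.interfaces.Message'], body)
# logentry == {'message': 'boom'}
```

If none of the names is present, the function falls through and implicitly returns `None`, which callers here treat as "interface absent".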
@@ -1,22 +1,17 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 |
@@ -1,103 +1,98 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from alembic import context |
|
23 | 18 | from sqlalchemy import engine_from_config, pool, MetaData |
|
24 | 19 | from logging.config import fileConfig |
|
25 | 20 | from appenlight.models import NAMING_CONVENTION |
|
26 | 21 | |
|
27 | 22 | # this is the Alembic Config object, which provides |
|
28 | 23 | # access to the values within the .ini file in use. |
|
29 | 24 | config = context.config |
|
30 | 25 | |
|
31 | 26 | # Interpret the config file for Python logging. |
|
32 | 27 | # This line sets up loggers basically. |
|
33 | 28 | if config.config_file_name: |
|
34 | 29 | fileConfig(config.config_file_name) |
|
35 | 30 | |
|
36 | 31 | # add your model's MetaData object here |
|
37 | 32 | # for 'autogenerate' support |
|
38 | 33 | # from myapp import mymodel |
|
39 | 34 | # target_metadata = mymodel.Base.metadata |
|
40 | 35 | |
|
41 | 36 | |
|
42 | 37 | target_metadata = MetaData(naming_convention=NAMING_CONVENTION) |
|
43 | 38 | |
|
44 | 39 | # other values from the config, defined by the needs of env.py, |
|
45 | 40 | # can be acquired: |
|
46 | 41 | # my_important_option = config.get_main_option("my_important_option") |
|
47 | 42 | # ... etc. |
|
48 | 43 | |
|
49 | 44 | VERSION_TABLE_NAME = 'alembic_appenlight_version' |
|
50 | 45 | |
|
51 | 46 | |
|
52 | 47 | def run_migrations_offline(): |
|
53 | 48 | """Run migrations in 'offline' mode. |
|
54 | 49 | |
|
55 | 50 | This configures the context with just a URL |
|
56 | 51 | and not an Engine, though an Engine is acceptable |
|
57 | 52 | here as well. By skipping the Engine creation |
|
58 | 53 | we don't even need a DBAPI to be available. |
|
59 | 54 | |
|
60 | 55 | Calls to context.execute() here emit the given string to the |
|
61 | 56 | script output. |
|
62 | 57 | |
|
63 | 58 | """ |
|
64 | 59 | url = config.get_main_option("sqlalchemy.url") |
|
65 | 60 | context.configure(url=url, target_metadata=target_metadata, |
|
66 | 61 | transaction_per_migration=True, |
|
67 | 62 | version_table=VERSION_TABLE_NAME) |
|
68 | 63 | |
|
69 | 64 | with context.begin_transaction(): |
|
70 | 65 | context.run_migrations() |
|
71 | 66 | |
|
72 | 67 | |
|
73 | 68 | def run_migrations_online(): |
|
74 | 69 | """Run migrations in 'online' mode. |
|
75 | 70 | |
|
76 | 71 | In this scenario we need to create an Engine |
|
77 | 72 | and associate a connection with the context. |
|
78 | 73 | |
|
79 | 74 | """ |
|
80 | 75 | engine = engine_from_config( |
|
81 | 76 | config.get_section(config.config_ini_section), |
|
82 | 77 | prefix='sqlalchemy.', |
|
83 | 78 | poolclass=pool.NullPool) |
|
84 | 79 | |
|
85 | 80 | connection = engine.connect() |
|
86 | 81 | context.configure( |
|
87 | 82 | connection=connection, |
|
88 | 83 | target_metadata=target_metadata, |
|
89 | 84 | transaction_per_migration=True, |
|
90 | 85 | version_table=VERSION_TABLE_NAME |
|
91 | 86 | ) |
|
92 | 87 | |
|
93 | 88 | try: |
|
94 | 89 | with context.begin_transaction(): |
|
95 | 90 | context.run_migrations() |
|
96 | 91 | finally: |
|
97 | 92 | connection.close() |
|
98 | 93 | |
|
99 | 94 | |
|
100 | 95 | if context.is_offline_mode(): |
|
101 | 96 | run_migrations_offline() |
|
102 | 97 | else: |
|
103 | 98 | run_migrations_online() |
@@ -1,629 +1,624 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | """initial tables |
|
23 | 18 | |
|
24 | 19 | Revision ID: 55b6e612672f |
|
25 | 20 | Revises: None |
|
26 | 21 | Create Date: 2014-10-13 23:47:38.295159 |
|
27 | 22 | |
|
28 | 23 | """ |
|
29 | 24 | |
|
30 | 25 | # revision identifiers, used by Alembic. |
|
31 | 26 | revision = '55b6e612672f' |
|
32 | 27 | down_revision = None |
|
33 | 28 | |
|
34 | 29 | from alembic import op |
|
35 | 30 | import sqlalchemy as sa |
|
36 | 31 | |
|
37 | 32 | |
|
def upgrade():
    op.add_column('users', sa.Column('first_name', sa.Unicode(25)))
    op.add_column('users', sa.Column('last_name', sa.Unicode(50)))
    op.add_column('users', sa.Column('company_name', sa.Unicode(255)))
    op.add_column('users', sa.Column('company_address', sa.Unicode(255)))
    op.add_column('users', sa.Column('phone1', sa.Unicode(25)))
    op.add_column('users', sa.Column('phone2', sa.Unicode(25)))
    op.add_column('users', sa.Column('zip_code', sa.Unicode(25)))
    op.add_column('users', sa.Column('default_report_sort', sa.Unicode(20),
                                     nullable=False, server_default="newest"))
    op.add_column('users', sa.Column('city', sa.Unicode(128)))
    op.add_column('users', sa.Column('notes', sa.UnicodeText,
                                     server_default=''))
    op.add_column('users', sa.Column('notifications', sa.Boolean(),
                                     nullable=False, server_default='true'))
    op.add_column('users', sa.Column('registration_ip', sa.Unicode(40),
                                     nullable=False, server_default=''))
|
    op.create_table(
        'integrations',
        sa.Column('id', sa.Integer(), primary_key=True),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('integration_name', sa.Unicode(64)),
        sa.Column('config', sa.dialects.postgresql.JSON, nullable=False),
        sa.Column('modified_date', sa.DateTime(), nullable=False,
                  server_default=sa.func.now()),
        sa.Column('external_id', sa.Unicode(255)),
        sa.Column('external_id2', sa.Unicode(255))
    )
|
    op.create_table(
        'alert_channels',
        sa.Column('owner_id', sa.Integer(),
                  sa.ForeignKey('users.id', onupdate='cascade',
                                ondelete='cascade'), nullable=False),
        sa.Column('channel_name', sa.Unicode(25), nullable=False),
        sa.Column('channel_value', sa.Unicode(80), nullable=False),
        sa.Column('channel_json_conf', sa.dialects.postgresql.JSON,
                  nullable=False),
        sa.Column('channel_validated', sa.Boolean, nullable=False,
                  server_default='False'),
        sa.Column('send_alerts', sa.Boolean, nullable=False,
                  server_default='True'),
        sa.Column('notify_only_first', sa.Boolean, nullable=False,
                  server_default='False'),
        sa.Column('daily_digest', sa.Boolean, nullable=False,
                  server_default='True'),
        sa.Column('pkey', sa.Integer(), primary_key=True),
        sa.Column('integration_id', sa.Integer,
                  sa.ForeignKey('integrations.id', onupdate='cascade',
                                ondelete='cascade')),
    )
    op.create_unique_constraint('uq_alert_channels', 'alert_channels',
                                ["owner_id", "channel_name", "channel_value"])
|
    op.create_table(
        'alert_channels_actions',
        sa.Column('owner_id', sa.Integer(), nullable=False),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('pkey', sa.Integer(), primary_key=True),
        sa.Column('action', sa.Unicode(10), nullable=False,
                  server_default='always'),
        sa.Column('rule', sa.dialects.postgresql.JSON),
        sa.Column('type', sa.Unicode(10), index=True),
        sa.Column('other_id', sa.Unicode(40), index=True),
        sa.Column('config', sa.dialects.postgresql.JSON),
        sa.Column('name', sa.Unicode(255), server_default='')
    )
|
    op.create_table(
        'application_postprocess_conf',
        sa.Column('pkey', sa.Integer(), primary_key=True),
        sa.Column('do', sa.Unicode(25), nullable=False),
        sa.Column('new_value', sa.UnicodeText(), nullable=False,
                  server_default=''),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id',
                                onupdate='cascade',
                                ondelete='cascade'), nullable=False),
        sa.Column('rule', sa.dialects.postgresql.JSON),
    )
|
    op.create_table(
        'applications',
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade'), nullable=False,
                  primary_key=True, autoincrement=False),
        sa.Column('domains', sa.UnicodeText, nullable=False),
        sa.Column('api_key', sa.Unicode(32), nullable=False, index=True),
        sa.Column('default_grouping', sa.Unicode(20), nullable=False,
                  server_default='url_type'),
        sa.Column('public_key', sa.Unicode(32), nullable=False, index=True),
        sa.Column('error_report_threshold', sa.Integer(),
                  server_default='10', nullable=False),
        sa.Column('slow_report_threshold', sa.Integer(),
                  server_default='10', nullable=False),
        sa.Column('apdex_threshold', sa.Float(), server_default='0.7',
                  nullable=False),
        sa.Column('allow_permanent_storage', sa.Boolean(),
                  server_default="false", nullable=False),
    )
    op.create_unique_constraint(None, 'applications', ["public_key"])
    op.create_unique_constraint(None, 'applications', ["api_key"])
|
    op.create_table(
        'metrics',
        sa.Column('pkey', sa.types.BigInteger, nullable=False,
                  primary_key=True),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id',
                                onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('timestamp', sa.DateTime),
        sa.Column('namespace', sa.Unicode(255)),
        sa.Column('tags', sa.dialects.postgresql.JSON, server_default="{}")
    )
|
    op.create_table(
        'events',
        sa.Column('id', sa.Integer, nullable=False, primary_key=True),
        sa.Column('start_date', sa.DateTime, nullable=False, index=True),
        sa.Column('end_date', sa.DateTime),
        sa.Column('status', sa.Integer(), nullable=False, index=True),
        sa.Column('event_type', sa.Integer(), nullable=False, index=True),
        sa.Column('origin_user_id', sa.Integer()),
        sa.Column('target_user_id', sa.Integer()),
        sa.Column('resource_id', sa.Integer(), index=True),
        sa.Column('text', sa.UnicodeText, server_default=''),
        sa.Column('values', sa.dialects.postgresql.JSON),
        sa.Column('target_id', sa.Integer()),
        sa.Column('target_uuid', sa.Unicode(40), index=True)
    )
|
    op.create_table(
        'logs',
        sa.Column('log_id', sa.types.BigInteger, nullable=False,
                  primary_key=True),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id',
                                onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('log_level', sa.SmallInteger(), nullable=False),
        sa.Column('primary_key', sa.Unicode(128), nullable=True),
        sa.Column('message', sa.UnicodeText, nullable=False,
                  server_default=''),
        sa.Column('timestamp', sa.DateTime),
        sa.Column('namespace', sa.Unicode(255)),
        sa.Column('request_id', sa.Unicode(40)),
        sa.Column('tags', sa.dialects.postgresql.JSON, server_default="{}"),
        sa.Column('permanent', sa.Boolean(), server_default="false",
                  nullable=False)
    )
|
    op.create_table(
        'reports_groups',
        sa.Column('id', sa.types.BigInteger, primary_key=True),
        sa.Column('resource_id', sa.Integer,
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade'), nullable=False),
        sa.Column('priority', sa.Integer, nullable=False,
                  server_default="5"),
        sa.Column('first_timestamp', sa.DateTime(), nullable=False,
                  server_default=sa.func.now()),
        sa.Column('last_timestamp', sa.DateTime()),
        sa.Column('error', sa.UnicodeText, nullable=False,
                  server_default=""),
        sa.Column('grouping_hash', sa.Unicode(40), nullable=False,
                  server_default=""),
        sa.Column('triggered_postprocesses_ids', sa.dialects.postgresql.JSON,
                  nullable=False, server_default="[]"),
        sa.Column('report_type', sa.Integer, nullable=False,
                  server_default="0"),
        sa.Column('total_reports', sa.Integer, nullable=False,
                  server_default="0"),
        sa.Column('last_report', sa.Integer, nullable=False,
                  server_default="0"),
        sa.Column('occurences', sa.Integer, nullable=False,
                  server_default="1"),
        sa.Column('average_duration', sa.Float(), nullable=False,
                  server_default="0"),
        sa.Column('summed_duration', sa.Float(), nullable=False,
                  server_default="0"),
        sa.Column('notified', sa.Boolean, nullable=False,
                  server_default="False"),
        sa.Column('fixed', sa.Boolean, nullable=False,
                  server_default="False"),
        sa.Column('public', sa.Boolean, nullable=False,
                  server_default="False"),
        sa.Column('read', sa.Boolean, nullable=False,
                  server_default="False"),
    )
|
    op.create_table(
        'reports',
        sa.Column('id', sa.types.BigInteger, primary_key=True),
        sa.Column('group_id', sa.types.BigInteger,
                  sa.ForeignKey('reports_groups.id', onupdate='cascade',
                                ondelete='cascade'), nullable=False,
                  index=True),
        sa.Column('resource_id', sa.Integer, nullable=False, index=True),
        sa.Column('report_type', sa.Integer, nullable=False,
                  server_default="0"),
        sa.Column('error', sa.UnicodeText, nullable=False,
                  server_default=""),
        sa.Column('extra', sa.dialects.postgresql.JSON, nullable=False,
                  server_default="{}"),
        sa.Column('request', sa.dialects.postgresql.JSON, nullable=False,
                  server_default="{}"),
        sa.Column('tags', sa.dialects.postgresql.JSON, nullable=False,
                  server_default="{}"),
        sa.Column('ip', sa.Unicode(39), nullable=False, server_default=""),
        sa.Column('username', sa.Unicode(255), nullable=False,
                  server_default=""),
        sa.Column('user_agent', sa.Unicode(512), nullable=False,
                  server_default=""),
        sa.Column('url', sa.UnicodeText, nullable=False, server_default=""),
        sa.Column('request_id', sa.Unicode(40), nullable=False,
                  server_default=""),
        sa.Column('request_stats', sa.dialects.postgresql.JSON,
                  nullable=False, server_default="{}"),
        sa.Column('traceback', sa.dialects.postgresql.JSON, nullable=False,
                  server_default="{}"),
        sa.Column('traceback_hash', sa.Unicode(40), nullable=False,
                  server_default=""),
        sa.Column('start_time', sa.DateTime(), nullable=False,
                  server_default=sa.func.now()),
        sa.Column('end_time', sa.DateTime()),
        sa.Column('report_group_time', sa.DateTime, index=True,
                  nullable=False, server_default=sa.func.now()),
        sa.Column('duration', sa.Float(), nullable=False,
                  server_default="0"),
        sa.Column('http_status', sa.Integer, index=True),
        sa.Column('url_domain', sa.Unicode(128)),
        sa.Column('url_path', sa.UnicodeText),
        sa.Column('language', sa.Integer, server_default="0"),
    )
    op.create_index(None, 'reports', [sa.text("(tags ->> 'server_name')")])
    op.create_index(None, 'reports', [sa.text("(tags ->> 'view_name')")])
|
    op.create_table(
        'reports_assignments',
        sa.Column('group_id', sa.types.BigInteger, nullable=False,
                  primary_key=True),
        sa.Column('owner_id', sa.Integer,
                  sa.ForeignKey('users.id', onupdate='cascade',
                                ondelete='cascade'),
                  nullable=False, primary_key=True),
        sa.Column('report_time', sa.DateTime, nullable=False)
    )
|
    op.create_table(
        'reports_comments',
        sa.Column('comment_id', sa.Integer, primary_key=True),
        sa.Column('body', sa.UnicodeText, nullable=False,
                  server_default=''),
        sa.Column('owner_id', sa.Integer,
                  sa.ForeignKey('users.id', onupdate='cascade',
                                ondelete='set null'), nullable=True),
        sa.Column('created_timestamp', sa.DateTime, nullable=False,
                  server_default=sa.func.now()),
        sa.Column('report_time', sa.DateTime, nullable=False),
        sa.Column('group_id', sa.types.BigInteger, nullable=False)
    )
|
    op.create_table(
        'reports_stats',
        sa.Column('resource_id', sa.Integer, nullable=False, index=True),
        sa.Column('start_interval', sa.DateTime, nullable=False, index=True),
        sa.Column('group_id', sa.types.BigInteger, index=True),
        sa.Column('occurences', sa.Integer, nullable=False,
                  server_default='0', index=True),
        sa.Column('owner_user_id', sa.Integer),
        sa.Column('type', sa.Integer, index=True, nullable=False),
        sa.Column('duration', sa.Float(), server_default='0'),
        sa.Column('server_name', sa.Unicode(128), server_default=''),
        sa.Column('view_name', sa.Unicode(128), server_default=''),
        sa.Column('id', sa.BigInteger(), nullable=False, primary_key=True),
    )
    op.create_index('ix_reports_stats_start_interval_group_id',
                    'reports_stats', ["start_interval", "group_id"])
|
    op.create_table(
        'slow_calls',
        sa.Column('id', sa.types.BigInteger, primary_key=True),
        sa.Column('report_id', sa.types.BigInteger,
                  sa.ForeignKey('reports.id', onupdate='cascade',
                                ondelete='cascade'),
                  nullable=False, index=True),
        sa.Column('duration', sa.Float(), nullable=False,
                  server_default="0", index=True),
        sa.Column('timestamp', sa.DateTime, nullable=False,
                  server_default=sa.func.now(), index=True),
        sa.Column('report_group_time', sa.DateTime, index=True,
                  nullable=False, server_default=sa.func.now()),
        sa.Column('type', sa.Unicode(16), nullable=False, index=True),
        sa.Column('statement', sa.UnicodeText, nullable=False,
                  server_default=''),
        sa.Column('parameters', sa.dialects.postgresql.JSON,
                  nullable=False),
        sa.Column('location', sa.UnicodeText, server_default=''),
        sa.Column('subtype', sa.Unicode(16), nullable=False, index=True),
        sa.Column('resource_id', sa.Integer, nullable=False, index=True),
        sa.Column('statement_hash', sa.Unicode(60), index=True)
    )
|
    op.create_table(
        'tags',
        sa.Column('id', sa.types.BigInteger, primary_key=True),
        sa.Column('resource_id', sa.Integer,
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('first_timestamp', sa.DateTime, nullable=False,
                  server_default=sa.func.now()),
        sa.Column('last_timestamp', sa.DateTime, nullable=False,
                  server_default=sa.func.now()),
        sa.Column('name', sa.Unicode(32), nullable=False),
        sa.Column('value', sa.dialects.postgresql.JSON, nullable=False),
        sa.Column('times_seen', sa.Integer, nullable=False,
                  server_default='1')
    )
|
    op.create_table(
        'auth_tokens',
        sa.Column('id', sa.Integer, nullable=False, primary_key=True),
        sa.Column('token', sa.Unicode),
        sa.Column('creation_date', sa.DateTime, nullable=False,
                  server_default=sa.func.now()),
        sa.Column('expires', sa.DateTime),
        sa.Column('owner_id', sa.Integer,
                  sa.ForeignKey('users.id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('description', sa.Unicode),
    )
|
    op.create_table(
        'channels_actions',
        sa.Column('channel_pkey', sa.Integer,
                  sa.ForeignKey('alert_channels.pkey',
                                ondelete='CASCADE', onupdate='CASCADE')),
        sa.Column('action_pkey', sa.Integer,
                  sa.ForeignKey('alert_channels_actions.pkey',
                                ondelete='CASCADE', onupdate='CASCADE'))
    )
|
    op.create_table(
        'config',
        sa.Column('key', sa.Unicode(128), primary_key=True),
        sa.Column('section', sa.Unicode(128), primary_key=True),
        sa.Column('value', sa.dialects.postgresql.JSON,
                  server_default="{}")
    )
|
    op.create_table(
        'plugin_configs',
        sa.Column('id', sa.Integer, primary_key=True),
        sa.Column('plugin_name', sa.Unicode(128)),
        sa.Column('section', sa.Unicode(128)),
        sa.Column('config', sa.dialects.postgresql.JSON,
                  server_default="{}"),
        sa.Column('resource_id', sa.Integer(),
                  sa.ForeignKey('resources.resource_id', onupdate='cascade',
                                ondelete='cascade')),
        sa.Column('owner_id', sa.Integer(),
                  sa.ForeignKey('users.id', onupdate='cascade',
                                ondelete='cascade')))
|
    op.create_table(
        'rc_versions',
        sa.Column('name', sa.Unicode(40), primary_key=True),
        sa.Column('value', sa.Unicode(40)),
    )
    version_table = sa.table('rc_versions',
                             sa.Column('name', sa.Unicode(40)),
                             sa.Column('value', sa.Unicode(40)))

    # seed one bookkeeping row per Elasticsearch-backed dataset
    for name in ('es_reports', 'es_reports_groups', 'es_reports_stats',
                 'es_logs', 'es_metrics', 'es_slow_calls'):
        op.execute(version_table.insert().values(name=name))
|
    op.execute('''
    CREATE OR REPLACE FUNCTION floor_time_5min(timestamp without time zone)
    RETURNS timestamp without time zone AS
    $BODY$SELECT date_trunc('hour', $1) + INTERVAL '5 min' * FLOOR(date_part('minute', $1) / 5.0)$BODY$
    LANGUAGE sql VOLATILE;
    ''')
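    # The SQL helper above floors a timestamp to its 5-minute bucket. As an
    # illustrative sanity check (not part of the migration), the same rounding
    # expressed in Python:
    #
    #     from datetime import datetime
    #
    #     def floor_time_5min(ts):
    #         # truncate to the hour, then add back the minute component
    #         # floored to a multiple of 5 (mirrors the SQL expression)
    #         return ts.replace(minute=(ts.minute // 5) * 5,
    #                           second=0, microsecond=0)
    #
    #     floor_time_5min(datetime(2014, 10, 13, 23, 47, 38))
    #     # -> datetime(2014, 10, 13, 23, 45)

The comment block above mirrors the behavior of the SQL function; as standalone Python it is:

```python
from datetime import datetime


def floor_time_5min(ts):
    # Truncate to the hour, then add back the minute component floored to a
    # multiple of 5 -- the same arithmetic as the SQL expression above.
    return ts.replace(minute=(ts.minute // 5) * 5, second=0, microsecond=0)


print(floor_time_5min(datetime(2014, 10, 13, 23, 47, 38)))
# -> 2014-10-13 23:45:00
```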
|
    op.execute('''
    CREATE OR REPLACE FUNCTION partition_logs() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
    AS $$
    DECLARE
        main_table varchar := 'logs';
        partitioned_table varchar := '';
    BEGIN

        IF NEW.permanent THEN
            partitioned_table := main_table || '_p_' || date_part('year', NEW.timestamp)::TEXT || '_' || DATE_part('month', NEW.timestamp);
        ELSE
            partitioned_table := main_table || '_p_' || date_part('year', NEW.timestamp)::TEXT || '_' || DATE_part('month', NEW.timestamp) || '_' || DATE_part('day', NEW.timestamp);
        END IF;

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            IF NEW.permanent THEN
                EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( timestamp >= DATE %s AND timestamp < DATE %s)) INHERITS (%s)',
                               partitioned_table,
                               quote_literal(date_trunc('month', NEW.timestamp)::date),
                               quote_literal((date_trunc('month', NEW.timestamp)::date + interval '1 month')::text),
                               main_table);
                EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(log_id);', partitioned_table, partitioned_table);
                EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_timestamp ON %s (timestamp);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_namespace_resource_id ON %s (namespace, resource_id);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s (resource_id);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_pkey_namespace ON %s (primary_key, namespace);', partitioned_table, partitioned_table);
            ELSE
                EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( timestamp >= DATE %s AND timestamp < DATE %s)) INHERITS (%s)',
                               partitioned_table,
                               quote_literal(date_trunc('day', NEW.timestamp)::date),
                               quote_literal((date_trunc('day', NEW.timestamp)::date + interval '1 day')::text),
                               main_table);
                EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s_ PRIMARY KEY(log_id);', partitioned_table, partitioned_table);
                EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_timestamp ON %s (timestamp);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_namespace_resource_id ON %s (namespace, resource_id);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s (resource_id);', partitioned_table, partitioned_table);
                EXECUTE format('CREATE INDEX ix_%s_primary_key_namespace ON %s (primary_key, namespace);', partitioned_table, partitioned_table);
            END IF;

            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;

        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_logs BEFORE INSERT ON logs
    FOR EACH ROW EXECUTE PROCEDURE partition_logs();
    ''')
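The trigger above routes each inserted row into a child table named after the row's timestamp: permanent rows go to monthly partitions, everything else to daily ones. A minimal Python sketch of that naming scheme (illustrative only; it assumes Postgres renders the `date_part` results without zero padding, as it does for whole numbers cast to text):

```python
from datetime import datetime


def logs_partition_name(timestamp, permanent):
    # Mirrors the name built in partition_logs(): 'logs_p_<year>_<month>'
    # for permanent (monthly) partitions, with a '_<day>' suffix for the
    # daily partitions used by non-permanent rows.
    name = 'logs_p_%d_%d' % (timestamp.year, timestamp.month)
    if not permanent:
        name += '_%d' % timestamp.day
    return name


print(logs_partition_name(datetime(2014, 10, 13), False))
# -> logs_p_2014_10_13
```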
|
439 | 434 | |
|
440 | 435 | op.execute(''' |
|
441 | 436 | CREATE OR REPLACE FUNCTION partition_metrics() RETURNS trigger |
|
442 | 437 | LANGUAGE plpgsql SECURITY DEFINER |
|
443 | 438 | AS $$ |
|
444 | 439 | DECLARE |
|
445 | 440 | main_table varchar := 'metrics'; |
|
446 | 441 | partitioned_table varchar := ''; |
|
447 | 442 | BEGIN |
|
448 | 443 | |
|
449 | 444 | partitioned_table := main_table || '_p_' || date_part('year', NEW.timestamp)::TEXT || '_' || DATE_part('month', NEW.timestamp) || '_' || DATE_part('day', NEW.timestamp); |
|
450 | 445 | |
|
451 | 446 | BEGIN |
|
452 | 447 | EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;'; |
|
453 | 448 | EXCEPTION |
|
454 | 449 | WHEN undefined_table THEN |
|
455 | 450 | RAISE NOTICE 'A partition has been created %', partitioned_table; |
|
456 | 451 | EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( timestamp >= DATE %s AND timestamp < DATE %s)) INHERITS (%s)', |
|
457 | 452 | partitioned_table, |
|
458 | 453 | quote_literal(date_trunc('day', NEW.timestamp)::date) , |
|
459 | 454 | quote_literal((date_trunc('day', NEW.timestamp)::date + interval '1 day')::text), |
|
460 | 455 | main_table); |
|
461 | 456 | EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(pkey);', partitioned_table, partitioned_table); |
|
462 | 457 | EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table); |
|
463 | 458 | EXECUTE format('CREATE INDEX ix_%s_timestamp ON %s (timestamp);', partitioned_table, partitioned_table); |
|
464 | 459 | EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s (resource_id);', partitioned_table, partitioned_table); |
|
465 | 460 | EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;'; |
|
466 | 461 | END; |
|
467 | 462 | |
|
468 | 463 | RETURN NULL; |
|
469 | 464 | END |
|
470 | 465 | $$; |
|
471 | 466 | ''') |
|
472 | 467 | |
|
473 | 468 | op.execute(''' |
|
474 | 469 | CREATE TRIGGER partition_metrics BEFORE INSERT ON metrics FOR EACH ROW EXECUTE PROCEDURE partition_metrics(); |
|
475 | 470 | ''') |
|
476 | 471 | |
|
    op.execute('''
    CREATE FUNCTION partition_reports_stats() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
    AS $$
    DECLARE
        main_table varchar := 'reports_stats';
        partitioned_table varchar := '';
    BEGIN

        partitioned_table := main_table || '_p_' || date_part('year', NEW.start_interval)::TEXT || '_' || DATE_part('month', NEW.start_interval);

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( start_interval >= DATE %s AND start_interval < DATE %s )) INHERITS (%s)',
                           partitioned_table,
                           quote_literal(date_trunc('month', NEW.start_interval)::date),
                           quote_literal((date_trunc('month', NEW.start_interval)::date + interval '1 month')::text),
                           main_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(id);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_start_interval ON %s USING btree (start_interval);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_type ON %s USING btree (type);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s USING btree (resource_id);', partitioned_table, partitioned_table);
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;
        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_reports_stats BEFORE INSERT ON reports_stats
    FOR EACH ROW EXECUTE PROCEDURE partition_reports_stats();
    ''')
|
512 | 507 | |
|
    op.execute('''
    CREATE OR REPLACE FUNCTION partition_reports_groups() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
    AS $$
    DECLARE
        main_table varchar := 'reports_groups';
        partitioned_table varchar := '';
    BEGIN

        partitioned_table := main_table || '_p_' || date_part('year', NEW.first_timestamp)::TEXT || '_' || DATE_part('month', NEW.first_timestamp);

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( first_timestamp >= DATE %s AND first_timestamp < DATE %s )) INHERITS (%s)',
                           partitioned_table,
                           quote_literal(date_trunc('month', NEW.first_timestamp)::date),
                           quote_literal((date_trunc('month', NEW.first_timestamp)::date + interval '1 month')::text),
                           main_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(id);', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;
        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_reports_groups BEFORE INSERT ON reports_groups
    FOR EACH ROW EXECUTE PROCEDURE partition_reports_groups();
    ''')
|
546 | 541 | |
|
    op.execute('''
    CREATE OR REPLACE FUNCTION partition_reports() RETURNS trigger
        LANGUAGE plpgsql SECURITY DEFINER
    AS $$
    DECLARE
        main_table varchar := 'reports';
        partitioned_table varchar := '';
        partitioned_parent_table varchar := '';
    BEGIN

        partitioned_table := main_table || '_p_' || date_part('year', NEW.report_group_time)::TEXT || '_' || DATE_part('month', NEW.report_group_time);
        partitioned_parent_table := 'reports_groups_p_' || date_part('year', NEW.report_group_time)::TEXT || '_' || DATE_part('month', NEW.report_group_time);

        BEGIN
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        EXCEPTION
        WHEN undefined_table THEN
            RAISE NOTICE 'A partition has been created %', partitioned_table;
            EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( report_group_time >= DATE %s AND report_group_time < DATE %s )) INHERITS (%s)',
                           partitioned_table,
                           quote_literal(date_trunc('month', NEW.report_group_time)::date),
                           quote_literal((date_trunc('month', NEW.report_group_time)::date + interval '1 month')::text),
                           main_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(id);', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table);
            EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_group_id FOREIGN KEY (group_id) REFERENCES %s (id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table, partitioned_parent_table);
            EXECUTE format('CREATE INDEX ix_%s_report_group_time ON %s USING btree (report_group_time);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_group_id ON %s USING btree (group_id);', partitioned_table, partitioned_table);
            EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s USING btree (resource_id);', partitioned_table, partitioned_table);
            EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;';
        END;
        RETURN NULL;
    END
    $$;
    ''')

    op.execute('''
    CREATE TRIGGER partition_reports BEFORE INSERT ON reports
    FOR EACH ROW EXECUTE PROCEDURE partition_reports();
    ''')
|
586 | 581 | |
|
587 | 582 | |
|
588 | 583 | op.execute(''' |
|
589 | 584 | CREATE OR REPLACE FUNCTION partition_slow_calls() RETURNS trigger |
|
590 | 585 | LANGUAGE plpgsql SECURITY DEFINER |
|
591 | 586 | AS $$ |
|
592 | 587 | DECLARE |
|
593 | 588 | main_table varchar := 'slow_calls'; |
|
594 | 589 | partitioned_table varchar := ''; |
|
595 | 590 | partitioned_parent_table varchar := ''; |
|
596 | 591 | BEGIN |
|
597 | 592 | |
|
598 | 593 | partitioned_table := main_table || '_p_' || date_part('year', NEW.report_group_time)::TEXT || '_' || DATE_part('month', NEW.report_group_time); |
|
599 | 594 | partitioned_parent_table := 'reports_p_' || date_part('year', NEW.report_group_time)::TEXT || '_' || DATE_part('month', NEW.report_group_time); |
|
600 | 595 | |
|
601 | 596 | BEGIN |
|
602 | 597 | EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;'; |
|
603 | 598 | EXCEPTION |
|
604 | 599 | WHEN undefined_table THEN |
|
605 | 600 | RAISE NOTICE 'A partition has been created %', partitioned_table; |
|
606 | 601 | EXECUTE format('CREATE TABLE IF NOT EXISTS %s ( CHECK( report_group_time >= DATE %s AND report_group_time < DATE %s )) INHERITS (%s)', |
|
607 | 602 | partitioned_table, |
|
608 | 603 | quote_literal(date_trunc('month', NEW.report_group_time)::date) , |
|
609 | 604 | quote_literal((date_trunc('month', NEW.report_group_time)::date + interval '1 month')::text), |
|
610 | 605 | main_table); |
|
611 | 606 | EXECUTE format('ALTER TABLE %s ADD CONSTRAINT pk_%s PRIMARY KEY(id);', partitioned_table, partitioned_table); |
|
612 | 607 | EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_resource_id FOREIGN KEY (resource_id) REFERENCES resources (resource_id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table); |
|
613 | 608 | EXECUTE format('ALTER TABLE %s ADD CONSTRAINT fk_%s_report_id FOREIGN KEY (report_id) REFERENCES %s (id) MATCH SIMPLE ON UPDATE CASCADE ON DELETE CASCADE;', partitioned_table, partitioned_table, partitioned_parent_table); |
|
614 | 609 | EXECUTE format('CREATE INDEX ix_%s_resource_id ON %s USING btree (resource_id);', partitioned_table, partitioned_table); |
|
615 | 610 | EXECUTE format('CREATE INDEX ix_%s_report_id ON %s USING btree (report_id);', partitioned_table, partitioned_table); |
|
616 | 611 | EXECUTE format('CREATE INDEX ix_%s_timestamp ON %s USING btree (timestamp);', partitioned_table, partitioned_table); |
|
617 | 612 | EXECUTE 'INSERT INTO ' || partitioned_table || ' SELECT(' || TG_TABLE_NAME || ' ' || quote_literal(NEW) || ').*;'; |
|
618 | 613 | END; |
|
619 | 614 | RETURN NULL; |
|
620 | 615 | END |
|
621 | 616 | $$; |
|
622 | 617 | ''') |
|
623 | 618 | |
|
624 | 619 | op.execute(''' |
|
625 | 620 | CREATE TRIGGER partition_slow_calls BEFORE INSERT ON slow_calls FOR EACH ROW EXECUTE PROCEDURE partition_slow_calls(); |
|
626 | 621 | ''') |
|
627 | 622 | |
|
628 | 623 | def downgrade(): |
|
629 | 624 | pass |
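The trigger functions in this migration all derive the target partition name from the row's timestamp with the same rule: `main_table || '_p_' || date_part('year', ts) || '_' || date_part('month', ts)`. A minimal Python sketch of that naming rule (a hypothetical helper for illustration, not part of the migration; note that Postgres emits the month without zero-padding, so January yields `_1`, not `_01`):

```python
from datetime import date

def partition_name(main_table: str, ts: date) -> str:
    """Mirror the plpgsql naming rule: <table>_p_<year>_<month>.

    date_part('month', ...) concatenated via || produces an
    unpadded number, so the Python side must not zero-pad either.
    """
    return "{}_p_{}_{}".format(main_table, ts.year, ts.month)

# A report timestamped 2017-01-15 is routed into:
print(partition_name("reports", date(2017, 1, 15)))  # reports_p_2017_1
```

Keeping the naming rule identical on both sides matters: `partition_reports` looks up its parent as `'reports_groups_p_' || ...`, so any padding mismatch would break the generated foreign keys.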
@@ -1,135 +1,130 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | |
|
24 | 19 | from sqlalchemy.ext.declarative import declarative_base |
|
25 | 20 | from sqlalchemy import MetaData |
|
26 | 21 | from sqlalchemy.orm import scoped_session |
|
27 | 22 | from sqlalchemy.orm import sessionmaker |
|
28 | 23 | from zope.sqlalchemy import ZopeTransactionExtension |
|
29 | 24 | import ziggurat_foundations |
|
30 | 25 | from ziggurat_foundations.models.base import get_db_session |
|
31 | 26 | |
|
32 | 27 | log = logging.getLogger(__name__) |
|
33 | 28 | |
|
34 | 29 | DBSession = scoped_session(sessionmaker(extension=ZopeTransactionExtension())) |
|
35 | 30 | |
|
36 | 31 | NAMING_CONVENTION = { |
|
37 | 32 | "ix": 'ix_%(column_0_label)s', |
|
38 | 33 | "uq": "uq_%(table_name)s_%(column_0_name)s", |
|
39 | 34 | "ck": "ck_%(table_name)s_%(constraint_name)s", |
|
40 | 35 | "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s", |
|
41 | 36 | "pk": "pk_%(table_name)s" |
|
42 | 37 | } |
|
43 | 38 | |
|
44 | 39 | metadata = MetaData(naming_convention=NAMING_CONVENTION) |
|
45 | 40 | Base = declarative_base(metadata=metadata) |
|
46 | 41 | |
|
47 | 42 | # optional for request.db approach |
|
48 | 43 | ziggurat_foundations.models.DBSession = DBSession |
|
49 | 44 | |
|
50 | 45 | |
|
51 | 46 | class Datastores(object): |
|
52 | 47 | redis = None |
|
53 | 48 | es = None |
|
54 | 49 | |
|
55 | 50 | |
|
56 | 51 | def register_datastores(es_conn, redis_conn, redis_lockmgr): |
|
57 | 52 | Datastores.es = es_conn |
|
58 | 53 | Datastores.redis = redis_conn |
|
59 | 54 | Datastores.lockmgr = redis_lockmgr |
|
60 | 55 | |
|
61 | 56 | |
|
62 | 57 | class SliceableESQuery(object): |
|
63 | 58 | def __init__(self, query, sort_query=None, aggregations=False, **kwconfig): |
|
64 | 59 | self.query = query |
|
65 | 60 | self.sort_query = sort_query |
|
66 | 61 | self.aggregations = aggregations |
|
67 | 62 | self.items_per_page = kwconfig.pop('items_per_page', 10) |
|
68 | 63 | self.page = kwconfig.pop('page', 1) |
|
69 | 64 | self.kwconfig = kwconfig |
|
70 | 65 | self.result = None |
|
71 | 66 | |
|
72 | 67 | def __getitem__(self, index): |
|
73 | 68 | config = self.kwconfig.copy() |
|
74 | 69 | config['es_from'] = index.start |
|
75 | 70 | query = self.query.copy() |
|
76 | 71 | if self.sort_query: |
|
77 | 72 | query.update(self.sort_query) |
|
78 | 73 | self.result = Datastores.es.search(query, size=self.items_per_page, |
|
79 | 74 | **config) |
|
80 | 75 | if self.aggregations: |
|
81 | 76 | self.items = self.result.get('aggregations') |
|
82 | 77 | else: |
|
83 | 78 | self.items = self.result['hits']['hits'] |
|
84 | 79 | |
|
85 | 80 | return self.items |
|
86 | 81 | |
|
87 | 82 | def __iter__(self): |
|
88 | 83 | return self.result |
|
89 | 84 | |
|
90 | 85 | def __len__(self): |
|
91 | 86 | config = self.kwconfig.copy() |
|
92 | 87 | query = self.query.copy() |
|
93 | 88 | self.result = Datastores.es.search(query, size=self.items_per_page, |
|
94 | 89 | **config) |
|
95 | 90 | if self.aggregations: |
|
96 | 91 | self.items = self.result.get('aggregations') |
|
97 | 92 | else: |
|
98 | 93 | self.items = self.result['hits']['hits'] |
|
99 | 94 | |
|
100 | 95 | count = int(self.result['hits']['total']) |
|
101 | 96 | return count if count < 5000 else 5000 |
|
102 | 97 | |
|
103 | 98 | |
|
104 | 99 | from appenlight.models.resource import Resource |
|
105 | 100 | from appenlight.models.application import Application |
|
106 | 101 | from appenlight.models.user import User |
|
107 | 102 | from appenlight.models.alert_channel import AlertChannel |
|
108 | 103 | from appenlight.models.alert_channel_action import AlertChannelAction |
|
109 | 104 | from appenlight.models.metric import Metric |
|
110 | 105 | from appenlight.models.application_postprocess_conf import \ |
|
111 | 106 | ApplicationPostprocessConf |
|
112 | 107 | from appenlight.models.auth_token import AuthToken |
|
113 | 108 | from appenlight.models.event import Event |
|
114 | 109 | from appenlight.models.external_identity import ExternalIdentity |
|
115 | 110 | from appenlight.models.group import Group |
|
116 | 111 | from appenlight.models.group_permission import GroupPermission |
|
117 | 112 | from appenlight.models.group_resource_permission import GroupResourcePermission |
|
118 | 113 | from appenlight.models.log import Log |
|
119 | 114 | from appenlight.models.plugin_config import PluginConfig |
|
120 | 115 | from appenlight.models.report import Report |
|
121 | 116 | from appenlight.models.report_group import ReportGroup |
|
122 | 117 | from appenlight.models.report_comment import ReportComment |
|
123 | 118 | from appenlight.models.report_assignment import ReportAssignment |
|
124 | 119 | from appenlight.models.report_stat import ReportStat |
|
125 | 120 | from appenlight.models.slow_call import SlowCall |
|
126 | 121 | from appenlight.models.tag import Tag |
|
127 | 122 | from appenlight.models.user_group import UserGroup |
|
128 | 123 | from appenlight.models.user_permission import UserPermission |
|
129 | 124 | from appenlight.models.user_resource_permission import UserResourcePermission |
|
130 | 125 | from ziggurat_foundations import ziggurat_model_init |
|
131 | 126 | |
|
132 | 127 | ziggurat_model_init(User, Group, UserGroup, GroupPermission, UserPermission, |
|
133 | 128 | UserResourcePermission, GroupResourcePermission, |
|
134 | 129 | Resource, |
|
135 | 130 | ExternalIdentity, passwordmanager=None) |
@@ -1,296 +1,291 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | import sqlalchemy as sa |
|
24 | 19 | import urllib.request, urllib.parse, urllib.error |
|
25 | 20 | from datetime import timedelta |
|
26 | 21 | from appenlight.models import Base |
|
27 | 22 | from appenlight.lib.utils.date_utils import convert_date |
|
28 | 23 | from sqlalchemy.dialects.postgresql import JSON |
|
29 | 24 | from ziggurat_foundations.models.base import BaseModel |
|
30 | 25 | |
|
31 | 26 | log = logging.getLogger(__name__) |
|
32 | 27 | |
|
33 | 28 | # |
|
34 | 29 | channel_rules_m2m_table = sa.Table( |
|
35 | 30 | 'channels_actions', Base.metadata, |
|
36 | 31 | sa.Column('channel_pkey', sa.Integer, |
|
37 | 32 | sa.ForeignKey('alert_channels.pkey')), |
|
38 | 33 | sa.Column('action_pkey', sa.Integer, |
|
39 | 34 | sa.ForeignKey('alert_channels_actions.pkey')) |
|
40 | 35 | ) |
|
41 | 36 | |
|
42 | 37 | DATE_FRMT = '%Y-%m-%dT%H:%M' |
|
43 | 38 | |
|
44 | 39 | |
|
45 | 40 | class AlertChannel(Base, BaseModel): |
|
46 | 41 | """ |
|
47 | 42 | Stores information about possible alerting options |
|
48 | 43 | """ |
|
49 | 44 | __tablename__ = 'alert_channels' |
|
50 | 45 | __possible_channel_names__ = ['email'] |
|
51 | 46 | __mapper_args__ = { |
|
52 | 47 | 'polymorphic_on': 'channel_name', |
|
53 | 48 | 'polymorphic_identity': 'integration' |
|
54 | 49 | } |
|
55 | 50 | |
|
56 | 51 | owner_id = sa.Column(sa.Unicode(30), |
|
57 | 52 | sa.ForeignKey('users.id', onupdate='CASCADE', |
|
58 | 53 | ondelete='CASCADE')) |
|
59 | 54 | channel_name = sa.Column(sa.Unicode(25), nullable=False) |
|
60 | 55 | channel_value = sa.Column(sa.Unicode(80), nullable=False, default='') |
|
61 | 56 | channel_json_conf = sa.Column(JSON(), nullable=False, default='') |
|
62 | 57 | channel_validated = sa.Column(sa.Boolean, nullable=False, |
|
63 | 58 | default=False) |
|
64 | 59 | send_alerts = sa.Column(sa.Boolean, nullable=False, |
|
65 | 60 | default=True) |
|
66 | 61 | daily_digest = sa.Column(sa.Boolean, nullable=False, |
|
67 | 62 | default=True) |
|
68 | 63 | integration_id = sa.Column(sa.Integer, sa.ForeignKey('integrations.id'), |
|
69 | 64 | nullable=True) |
|
70 | 65 | pkey = sa.Column(sa.Integer(), nullable=False, primary_key=True) |
|
71 | 66 | |
|
72 | 67 | channel_actions = sa.orm.relationship('AlertChannelAction', |
|
73 | 68 | cascade="all", |
|
74 | 69 | passive_deletes=True, |
|
75 | 70 | passive_updates=True, |
|
76 | 71 | secondary=channel_rules_m2m_table, |
|
77 | 72 | backref='channels') |
|
78 | 73 | |
|
79 | 74 | @property |
|
80 | 75 | def channel_visible_value(self): |
|
81 | 76 | if self.integration: |
|
82 | 77 | return '{}: {}'.format( |
|
83 | 78 | self.channel_name, |
|
84 | 79 | self.integration.resource.resource_name |
|
85 | 80 | ) |
|
86 | 81 | |
|
87 | 82 | return '{}: {}'.format( |
|
88 | 83 | self.channel_name, |
|
89 | 84 | self.channel_value |
|
90 | 85 | ) |
|
91 | 86 | |
|
92 | 87 | def get_dict(self, exclude_keys=None, include_keys=None, |
|
93 | 88 | extended_info=True): |
|
94 | 89 | """ |
|
95 | 90 | Returns dictionary with required information that will be consumed by |
|
96 | 91 | angular |
|
97 | 92 | """ |
|
98 | 93 | instance_dict = super(AlertChannel, self).get_dict(exclude_keys, |
|
99 | 94 | include_keys) |
|
100 | 95 | exclude_keys_list = exclude_keys or [] |
|
101 | 96 | include_keys_list = include_keys or [] |
|
102 | 97 | |
|
103 | 98 | instance_dict['supports_report_alerting'] = True |
|
104 | 99 | instance_dict['channel_visible_value'] = self.channel_visible_value |
|
105 | 100 | |
|
106 | 101 | if extended_info: |
|
107 | 102 | instance_dict['actions'] = [ |
|
108 | 103 | rule.get_dict(extended_info=True) for |
|
109 | 104 | rule in self.channel_actions] |
|
110 | 105 | |
|
111 | 106 | del instance_dict['channel_json_conf'] |
|
112 | 107 | |
|
113 | 108 | if self.integration: |
|
114 | 109 | instance_dict[ |
|
115 | 110 | 'supports_report_alerting'] = \ |
|
116 | 111 | self.integration.supports_report_alerting |
|
117 | 112 | d = {} |
|
118 | 113 | for k in instance_dict.keys(): |
|
119 | 114 | if (k not in exclude_keys_list and |
|
120 | 115 | (k in include_keys_list or not include_keys)): |
|
121 | 116 | d[k] = instance_dict[k] |
|
122 | 117 | return d |
|
123 | 118 | |
|
124 | 119 | def __repr__(self): |
|
125 | 120 | return '<AlertChannel: (%s,%s), user:%s>' % (self.channel_name, |
|
126 | 121 | self.channel_value, |
|
127 | 122 | self.user_name,) |
|
128 | 123 | |
|
129 | 124 | def send_digest(self, **kwargs): |
|
130 | 125 | """ |
|
131 | 126 | This should implement daily top error report notifications |
|
132 | 127 | """ |
|
133 | 128 | log.warning('send_digest NOT IMPLEMENTED') |
|
134 | 129 | |
|
135 | 130 | def notify_reports(self, **kwargs): |
|
136 | 131 | """ |
|
137 | 132 | This should implement notification of reports that occurred in 1 min
|
138 | 133 | interval |
|
139 | 134 | """ |
|
140 | 135 | log.warning('notify_reports NOT IMPLEMENTED') |
|
141 | 136 | |
|
142 | 137 | def notify_alert(self, **kwargs): |
|
143 | 138 | """ |
|
144 | 139 | Notify user of report/uptime/chart threshold events based on the event's alert
|
145 | 140 | type |
|
146 | 141 | |
|
147 | 142 | Kwargs: |
|
148 | 143 | application: application that the event applies for, |
|
149 | 144 | event: event that is notified, |
|
150 | 145 | user: user that should be notified |
|
151 | 146 | request: request object |
|
152 | 147 | |
|
153 | 148 | """ |
|
154 | 149 | alert_name = kwargs['event'].unified_alert_name() |
|
155 | 150 | if alert_name in ['slow_report_alert', 'error_report_alert']: |
|
156 | 151 | self.notify_report_alert(**kwargs) |
|
157 | 152 | elif alert_name == 'uptime_alert': |
|
158 | 153 | self.notify_uptime_alert(**kwargs) |
|
159 | 154 | elif alert_name == 'chart_alert': |
|
160 | 155 | self.notify_chart_alert(**kwargs) |
|
161 | 156 | |
|
162 | 157 | def notify_chart_alert(self, **kwargs): |
|
163 | 158 | """ |
|
164 | 159 | This should implement chart threshold alert notifications
|
165 | 160 | """ |
|
166 | 161 | log.warning('notify_chart_alert NOT IMPLEMENTED') |
|
167 | 162 | |
|
168 | 163 | def notify_report_alert(self, **kwargs): |
|
169 | 164 | """ |
|
170 | 165 | This should implement report open/close alert notifications
|
171 | 166 | """ |
|
172 | 167 | log.warning('notify_report_alert NOT IMPLEMENTED') |
|
173 | 168 | |
|
174 | 169 | def notify_uptime_alert(self, **kwargs): |
|
175 | 170 | """ |
|
176 | 171 | This should implement uptime open/close alert notifications
|
177 | 172 | """ |
|
178 | 173 | log.warning('notify_uptime_alert NOT IMPLEMENTED') |
|
179 | 174 | |
|
180 | 175 | def get_notification_basic_vars(self, kwargs): |
|
181 | 176 | """ |
|
182 | 177 | Sets most common variables used later for rendering notifications for |
|
183 | 178 | channel |
|
184 | 179 | """ |
|
185 | 180 | if 'event' in kwargs: |
|
186 | 181 | kwargs['since_when'] = kwargs['event'].start_date |
|
187 | 182 | |
|
188 | 183 | url_start_date = kwargs.get('since_when') - timedelta(minutes=1) |
|
189 | 184 | url_end_date = kwargs.get('since_when') + timedelta(minutes=4) |
|
190 | 185 | tmpl_vars = { |
|
191 | 186 | "timestamp": kwargs['since_when'], |
|
192 | 187 | "user": kwargs['user'], |
|
193 | 188 | "since_when": kwargs.get('since_when'), |
|
194 | 189 | "url_start_date": url_start_date, |
|
195 | 190 | "url_end_date": url_end_date |
|
196 | 191 | } |
|
197 | 192 | tmpl_vars["resource_name"] = kwargs['resource'].resource_name |
|
198 | 193 | tmpl_vars["resource"] = kwargs['resource'] |
|
199 | 194 | |
|
200 | 195 | if 'event' in kwargs: |
|
201 | 196 | tmpl_vars['event_values'] = kwargs['event'].values |
|
202 | 197 | tmpl_vars['alert_type'] = kwargs['event'].unified_alert_name() |
|
203 | 198 | tmpl_vars['alert_action'] = kwargs['event'].unified_alert_action() |
|
204 | 199 | return tmpl_vars |
|
205 | 200 | |
|
206 | 201 | def report_alert_notification_vars(self, kwargs): |
|
207 | 202 | tmpl_vars = self.get_notification_basic_vars(kwargs) |
|
208 | 203 | reports = kwargs.get('reports', []) |
|
209 | 204 | tmpl_vars["reports"] = reports |
|
210 | 205 | tmpl_vars["confirmed_total"] = len(reports) |
|
211 | 206 | |
|
212 | 207 | tmpl_vars["report_type"] = "error reports" |
|
213 | 208 | tmpl_vars["url_report_type"] = 'report/list' |
|
214 | 209 | |
|
215 | 210 | alert_type = tmpl_vars.get('alert_type', '') |
|
216 | 211 | if 'slow_report' in alert_type: |
|
217 | 212 | tmpl_vars["report_type"] = "slow reports" |
|
218 | 213 | tmpl_vars["url_report_type"] = 'report/list_slow' |
|
219 | 214 | |
|
220 | 215 | app_url = kwargs['request'].registry.settings['_mail_url'] |
|
221 | 216 | |
|
222 | 217 | destination_url = kwargs['request'].route_url('/', |
|
223 | 218 | _app_url=app_url) |
|
224 | 219 | if alert_type: |
|
225 | 220 | destination_url += 'ui/{}?resource={}&start_date={}&end_date={}'.format( |
|
226 | 221 | tmpl_vars["url_report_type"], |
|
227 | 222 | tmpl_vars['resource'].resource_id, |
|
228 | 223 | tmpl_vars['url_start_date'].strftime(DATE_FRMT), |
|
229 | 224 | tmpl_vars['url_end_date'].strftime(DATE_FRMT) |
|
230 | 225 | ) |
|
231 | 226 | else: |
|
232 | 227 | destination_url += 'ui/{}?resource={}'.format( |
|
233 | 228 | tmpl_vars["url_report_type"], |
|
234 | 229 | tmpl_vars['resource'].resource_id |
|
235 | 230 | ) |
|
236 | 231 | tmpl_vars["destination_url"] = destination_url |
|
237 | 232 | |
|
238 | 233 | return tmpl_vars |
|
239 | 234 | |
|
240 | 235 | def uptime_alert_notification_vars(self, kwargs): |
|
241 | 236 | tmpl_vars = self.get_notification_basic_vars(kwargs) |
|
242 | 237 | app_url = kwargs['request'].registry.settings['_mail_url'] |
|
243 | 238 | destination_url = kwargs['request'].route_url('/', _app_url=app_url) |
|
244 | 239 | destination_url += 'ui/{}?resource={}'.format( |
|
245 | 240 | 'uptime', |
|
246 | 241 | tmpl_vars['resource'].resource_id) |
|
247 | 242 | tmpl_vars['destination_url'] = destination_url |
|
248 | 243 | |
|
249 | 244 | reason = '' |
|
250 | 245 | e_values = tmpl_vars.get('event_values') |
|
251 | 246 | |
|
252 | 247 | if e_values and e_values.get('response_time') == 0: |
|
253 | 248 | reason += ' Response time was slower than 20 seconds.' |
|
254 | 249 | elif e_values: |
|
255 | 250 | code = e_values.get('status_code') |
|
256 | 251 | reason += ' Response status code: %s.' % code |
|
257 | 252 | |
|
258 | 253 | tmpl_vars['reason'] = reason |
|
259 | 254 | return tmpl_vars |
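The branch above encodes a convention: a stored `response_time` of `0` marks a probe that timed out, otherwise the status code is reported. The same logic as a standalone sketch (hypothetical helper, extracted for illustration):

```python
def uptime_reason(event_values):
    """Mirror uptime_alert_notification_vars: response_time == 0 means timeout."""
    reason = ''
    if event_values and event_values.get('response_time') == 0:
        # the monitor records 0 when the request exceeded the 20 s budget
        reason += ' Response time was slower than 20 seconds.'
    elif event_values:
        reason += ' Response status code: %s.' % event_values.get('status_code')
    return reason

print(uptime_reason({'response_time': 0}))
print(uptime_reason({'response_time': 1.2, 'status_code': 502}))
```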
|
260 | 255 | |
|
261 | 256 | def chart_alert_notification_vars(self, kwargs): |
|
262 | 257 | tmpl_vars = self.get_notification_basic_vars(kwargs) |
|
263 | 258 | tmpl_vars['chart_name'] = tmpl_vars['event_values']['chart_name'] |
|
264 | 259 | tmpl_vars['action_name'] = tmpl_vars['event_values'].get( |
|
265 | 260 | 'action_name') or '' |
|
266 | 261 | matched_values = tmpl_vars['event_values']['matched_step_values'] |
|
267 | 262 | tmpl_vars['readable_values'] = [] |
|
268 | 263 | for key, value in list(matched_values['values'].items()): |
|
269 | 264 | matched_label = matched_values['labels'].get(key) |
|
270 | 265 | if matched_label: |
|
271 | 266 | tmpl_vars['readable_values'].append({ |
|
272 | 267 | 'label': matched_label['human_label'], |
|
273 | 268 | 'value': value |
|
274 | 269 | }) |
|
275 | 270 | tmpl_vars['readable_values'] = sorted(tmpl_vars['readable_values'], |
|
276 | 271 | key=lambda x: x['label']) |
|
277 | 272 | start_date = convert_date(tmpl_vars['event_values']['start_interval']) |
|
278 | 273 | end_date = None |
|
279 | 274 | if tmpl_vars['event_values'].get('end_interval'): |
|
280 | 275 | end_date = convert_date(tmpl_vars['event_values']['end_interval']) |
|
281 | 276 | |
|
282 | 277 | app_url = kwargs['request'].registry.settings['_mail_url'] |
|
283 | 278 | destination_url = kwargs['request'].route_url('/', _app_url=app_url) |
|
284 | 279 | to_encode = { |
|
285 | 280 | 'resource': tmpl_vars['event_values']['resource'], |
|
286 | 281 | 'start_date': start_date.strftime(DATE_FRMT), |
|
287 | 282 | } |
|
288 | 283 | if end_date: |
|
289 | 284 | to_encode['end_date'] = end_date.strftime(DATE_FRMT) |
|
290 | 285 | |
|
291 | 286 | destination_url += 'ui/{}?{}'.format( |
|
292 | 287 | 'logs', |
|
293 | 288 | urllib.parse.urlencode(to_encode) |
|
294 | 289 | ) |
|
295 | 290 | tmpl_vars['destination_url'] = destination_url |
|
296 | 291 | return tmpl_vars |
@@ -1,84 +1,79 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | |
|
24 | 19 | from appenlight.models.resource import Resource |
|
25 | 20 | from appenlight.models import Base, get_db_session |
|
26 | 21 | from sqlalchemy.orm import validates |
|
27 | 22 | from ziggurat_foundations.models.base import BaseModel |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class AlertChannelAction(Base, BaseModel): |
|
31 | 26 | """ |
|
32 | 27 | Stores notifications conditions for user's alert channels |
|
33 | 28 | This is later used for rule parsing like "alert if http_status == 500" |
|
34 | 29 | """ |
|
35 | 30 | __tablename__ = 'alert_channels_actions' |
|
36 | 31 | |
|
37 | 32 | types = ['report', 'chart'] |
|
38 | 33 | |
|
39 | 34 | owner_id = sa.Column(sa.Integer, |
|
40 | 35 | sa.ForeignKey('users.id', onupdate='CASCADE', |
|
41 | 36 | ondelete='CASCADE')) |
|
42 | 37 | resource_id = sa.Column(sa.Integer()) |
|
43 | 38 | action = sa.Column(sa.Unicode(10), nullable=False, default='always') |
|
44 | 39 | type = sa.Column(sa.Unicode(10), nullable=False) |
|
45 | 40 | other_id = sa.Column(sa.Unicode(40)) |
|
46 | 41 | pkey = sa.Column(sa.Integer(), nullable=False, primary_key=True) |
|
47 | 42 | rule = sa.Column(sa.dialects.postgresql.JSON, |
|
48 | 43 | nullable=False, default={'field': 'http_status', |
|
49 | 44 | "op": "ge", "value": "500"}) |
|
50 | 45 | config = sa.Column(sa.dialects.postgresql.JSON) |
|
51 | 46 | name = sa.Column(sa.Unicode(255)) |
|
52 | 47 | |
|
53 | 48 | @validates('notify_type') |
|
54 | 49 | def validate_email(self, key, notify_type): |
|
55 | 50 | assert notify_type in ['always', 'only_first'] |
|
56 | 51 | return notify_type |
|
57 | 52 | |
|
58 | 53 | def resource_name(self, db_session=None): |
|
59 | 54 | db_session = get_db_session(db_session) |
|
60 | 55 | if self.resource_id: |
|
61 | 56 | return Resource.by_resource_id(self.resource_id, |
|
62 | 57 | db_session=db_session).resource_name |
|
63 | 58 | else: |
|
64 | 59 | return 'any resource' |
|
65 | 60 | |
|
66 | 61 | def get_dict(self, exclude_keys=None, include_keys=None, |
|
67 | 62 | extended_info=False): |
|
68 | 63 | """ |
|
69 | 64 | Returns dictionary with required information that will be consumed by |
|
70 | 65 | angular |
|
71 | 66 | """ |
|
72 | 67 | instance_dict = super(AlertChannelAction, self).get_dict() |
|
73 | 68 | exclude_keys_list = exclude_keys or [] |
|
74 | 69 | include_keys_list = include_keys or [] |
|
75 | 70 | if extended_info: |
|
76 | 71 | instance_dict['channels'] = [ |
|
77 | 72 | c.get_dict(extended_info=False) for c in self.channels] |
|
78 | 73 | |
|
79 | 74 | d = {} |
|
80 | 75 | for k in instance_dict.keys(): |
|
81 | 76 | if (k not in exclude_keys_list and |
|
82 | 77 | (k in include_keys_list or not include_keys)): |
|
83 | 78 | d[k] = instance_dict[k] |
|
84 | 79 | return d |
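Both `get_dict` implementations in this changeset end with the same key-filtering loop: drop excluded keys, and when an include list is given, keep only those. A self-contained sketch of that filter (names are illustrative, not part of the models):

```python
def filter_keys(instance_dict, exclude_keys=None, include_keys=None):
    """Keep a key unless excluded; if include_keys is given, keep only those."""
    exclude_keys_list = exclude_keys or []
    include_keys_list = include_keys or []
    return {
        k: v for k, v in instance_dict.items()
        if k not in exclude_keys_list
        and (k in include_keys_list or not include_keys)
    }

d = {'pkey': 1, 'name': 'rule', 'config': {}}
print(filter_keys(d, exclude_keys=['config']))  # {'pkey': 1, 'name': 'rule'}
print(filter_keys(d, include_keys=['name']))    # {'name': 'rule'}
```

The `or not include_keys` clause is what makes the include list opt-in: with no include list every non-excluded key passes, matching the loops in `AlertChannel.get_dict` and `AlertChannelAction.get_dict`.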
@@ -1,21 +1,16 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 |
@@ -1,193 +1,188 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | from appenlight.models.alert_channel import AlertChannel |
|
24 | 19 | from appenlight.models.integrations.campfire import CampfireIntegration |
|
25 | 20 | from webhelpers2.text import truncate |
|
26 | 21 | |
|
27 | 22 | log = logging.getLogger(__name__) |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class CampfireAlertChannel(AlertChannel): |
|
31 | 26 | __mapper_args__ = { |
|
32 | 27 | 'polymorphic_identity': 'campfire' |
|
33 | 28 | } |
|
34 | 29 | |
|
35 | 30 | @property |
|
36 | 31 | def client(self): |
|
37 | 32 | client = CampfireIntegration.create_client( |
|
38 | 33 | self.integration.config['api_token'], |
|
39 | 34 | self.integration.config['account']) |
|
40 | 35 | return client |
|
41 | 36 | |
|
42 | 37 | def notify_reports(self, **kwargs): |
|
43 | 38 | """ |
|
44 | 39 | Notify user of individual reports |
|
45 | 40 | |
|
46 | 41 | kwargs: |
|
47 | 42 | application: application that the event applies for, |
|
48 | 43 | user: user that should be notified |
|
49 | 44 | request: request object |
|
50 | 45 | since_when: reports are newer than this time value, |
|
51 | 46 | reports: list of reports to render |
|
52 | 47 | |
|
53 | 48 | """ |
|
54 | 49 | template_vars = self.report_alert_notification_vars(kwargs) |
|
55 | 50 | |
|
56 | 51 | app_url = kwargs['request'].registry.settings['_mail_url'] |
|
57 | 52 | destination_url = kwargs['request'].route_url('/', |
|
58 | 53 | app_url=app_url) |
|
59 | 54 | f_args = ('report', |
|
60 | 55 | template_vars['resource'].resource_id, |
|
61 | 56 | template_vars['url_start_date'].strftime('%Y-%m-%dT%H:%M'), |
|
62 | 57 | template_vars['url_end_date'].strftime('%Y-%m-%dT%H:%M')) |
|
63 | 58 | destination_url += 'ui/{}?resource={}&start_date={}&end_date={}'.format( |
|
64 | 59 | *f_args) |
|
65 | 60 | |
|
66 | 61 | if template_vars['confirmed_total'] > 1: |
|
67 | 62 | template_vars["title"] = "%s - %s reports" % ( |
|
68 | 63 | template_vars['resource_name'], |
|
69 | 64 | template_vars['confirmed_total'], |
|
70 | 65 | ) |
|
71 | 66 | else: |
|
72 | 67 | error_title = truncate(template_vars['reports'][0][1].error or |
|
73 | 68 | 'slow report', 90) |
|
74 | 69 | template_vars["title"] = "%s - '%s' report" % ( |
|
75 | 70 | template_vars['resource_name'], |
|
76 | 71 | error_title) |
|
77 | 72 | |
|
78 | 73 | template_vars["title"] += ' ' + destination_url |
|
79 | 74 | |
|
80 | 75 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( |
|
81 | 76 | kwargs['user'].user_name, |
|
82 | 77 | self.channel_visible_value, |
|
83 | 78 | template_vars['confirmed_total']) |
|
84 | 79 | log.warning(log_msg) |
|
85 | 80 | |
|
86 | 81 | for room in self.integration.config['rooms'].split(','): |
|
87 | 82 | self.client.speak_to_room(room.strip(), template_vars["title"]) |
|
88 | 83 | |
|
89 | 84 | def notify_report_alert(self, **kwargs): |
|
90 | 85 | """ |
|
91 | 86 | Build and send report alert notification |
|
92 | 87 | |
|
93 | 88 | Kwargs: |
|
94 | 89 | application: application that the event applies for, |
|
95 | 90 | event: event that is notified, |
|
96 | 91 | user: user that should be notified |
|
97 | 92 | request: request object |
|
98 | 93 | |
|
99 | 94 | """ |
|
100 | 95 | template_vars = self.report_alert_notification_vars(kwargs) |
|
101 | 96 | |
|
102 | 97 | if kwargs['event'].unified_alert_action() == 'OPEN': |
|
103 | 98 | title = 'ALERT %s: %s - %s %s %s' % ( |
|
104 | 99 | template_vars['alert_action'], |
|
105 | 100 | template_vars['resource_name'], |
|
106 | 101 | kwargs['event'].values['reports'], |
|
107 | 102 | template_vars['report_type'], |
|
108 | 103 | template_vars['destination_url'] |
|
109 | 104 | ) |
|
110 | 105 | |
|
111 | 106 | else: |
|
112 | 107 | title = 'ALERT %s: %s type: %s' % ( |
|
113 | 108 | template_vars['alert_action'], |
|
114 | 109 | template_vars['resource_name'], |
|
115 | 110 | template_vars['alert_type'].replace('_', ' '), |
|
116 | 111 | ) |
|
117 | 112 | for room in self.integration.config['rooms'].split(','): |
|
118 | 113 | self.client.speak_to_room(room.strip(), title, sound='VUVUZELA') |
|
119 | 114 | |
|
120 | 115 | def notify_uptime_alert(self, **kwargs): |
|
121 | 116 | """ |
|
122 | 117 | Build and send uptime alert notification |
|
123 | 118 | |
|
124 | 119 | Kwargs: |
|
125 | 120 | application: application that the event applies for, |
|
126 | 121 | event: event that is notified, |
|
127 | 122 | user: user that should be notified |
|
128 | 123 | request: request object |
|
129 | 124 | |
|
130 | 125 | """ |
|
131 | 126 | template_vars = self.uptime_alert_notification_vars(kwargs) |
|
132 | 127 | |
|
133 | 128 | message = 'ALERT %s: %s has uptime issues %s\n\n' % ( |
|
134 | 129 | template_vars['alert_action'], |
|
135 | 130 | template_vars['resource_name'], |
|
136 | 131 | template_vars['destination_url'] |
|
137 | 132 | ) |
|
138 | 133 | message += template_vars['reason'] |
|
139 | 134 | |
|
140 | 135 | for room in self.integration.config['rooms'].split(','): |
|
141 | 136 | self.client.speak_to_room(room.strip(), message, sound='VUVUZELA') |
|
142 | 137 | |
|
143 | 138 | def send_digest(self, **kwargs): |
|
144 | 139 | """ |
|
145 | 140 | Build and send daily digest notification |
|
146 | 141 | |
|
147 | 142 | kwargs: |
|
148 | 143 | application: application that the event applies for, |
|
149 | 144 | user: user that should be notified |
|
150 | 145 | request: request object |
|
151 | 146 | since_when: reports are newer than this time value, |
|
152 | 147 | reports: list of reports to render |
|
153 | 148 | |
|
154 | 149 | """ |
|
155 | 150 | template_vars = self.report_alert_notification_vars(kwargs) |
|
156 | 151 | f_args = (template_vars['resource_name'], |
|
157 | 152 | template_vars['confirmed_total'],) |
|
158 | 153 | message = "Daily report digest: %s - %s reports" % f_args |
|
159 | 154 | message += '{}\n'.format(template_vars['destination_url']) |
|
160 | 155 | for room in self.integration.config['rooms'].split(','): |
|
161 | 156 | self.client.speak_to_room(room.strip(), message) |
|
162 | 157 | |
|
163 | 158 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( |
|
164 | 159 | kwargs['user'].user_name, |
|
165 | 160 | self.channel_visible_value, |
|
166 | 161 | template_vars['confirmed_total']) |
|
167 | 162 | log.warning(log_msg) |
|
168 | 163 | |
|
169 | 164 | def notify_chart_alert(self, **kwargs): |
|
170 | 165 | """ |
|
171 | 166 | Build and send chart alert notification |
|
172 | 167 | |
|
173 | 168 | Kwargs: |
|
174 | 169 | application: application that the event applies for, |
|
175 | 170 | event: event that is notified, |
|
176 | 171 | user: user that should be notified |
|
177 | 172 | request: request object |
|
178 | 173 | |
|
179 | 174 | """ |
|
180 | 175 | template_vars = self.chart_alert_notification_vars(kwargs) |
|
181 | 176 | message = 'ALERT {}: value in "{}" chart: ' \ |
|
182 | 177 | 'met alert "{}" criteria {} \n'.format( |
|
183 | 178 | template_vars['alert_action'], |
|
184 | 179 | template_vars['chart_name'], |
|
185 | 180 | template_vars['action_name'], |
|
186 | 181 | template_vars['destination_url'] |
|
187 | 182 | ) |
|
188 | 183 | |
|
189 | 184 | for item in template_vars['readable_values']: |
|
190 | 185 | message += '{}: {}\n'.format(item['label'], item['value']) |
|
191 | 186 | |
|
192 | 187 | for room in self.integration.config['rooms'].split(','): |
|
193 | 188 | self.client.speak_to_room(room.strip(), message, sound='VUVUZELA') |
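The Campfire, Flowdock, and Hipchat channels above all build the same dashboard URL: the mail base URL plus a `ui/report` path with start and end timestamps in a compact ISO format. A minimal sketch of that construction, with a hypothetical `report_ui_url` helper in place of the inline formatting:

```python
from datetime import datetime

def report_ui_url(base_url, resource_id, start, end):
    # Mirrors the URL the channels assemble: base + 'ui/report?...'
    # with '%Y-%m-%dT%H:%M' timestamps, as in the code above.
    fmt = '%Y-%m-%dT%H:%M'
    return base_url + 'ui/{}?resource={}&start_date={}&end_date={}'.format(
        'report', resource_id, start.strftime(fmt), end.strftime(fmt))

url = report_ui_url('https://example.invalid/', 42,
                    datetime(2017, 1, 2, 3, 4),
                    datetime(2017, 1, 2, 4, 4))
```

The timestamps deliberately omit seconds, so links for the same minute compare equal regardless of when within that minute the alert fired.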
@@ -1,180 +1,175 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | from appenlight.models.alert_channel import AlertChannel |
|
24 | 19 | from appenlight.models.services.user import UserService |
|
25 | 20 | from webhelpers2.text import truncate |
|
26 | 21 | |
|
27 | 22 | log = logging.getLogger(__name__) |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class EmailAlertChannel(AlertChannel): |
|
31 | 26 | """ |
|
32 | 27 | Default email alerting channel |
|
33 | 28 | """ |
|
34 | 29 | |
|
35 | 30 | __mapper_args__ = { |
|
36 | 31 | 'polymorphic_identity': 'email' |
|
37 | 32 | } |
|
38 | 33 | |
|
39 | 34 | def notify_reports(self, **kwargs): |
|
40 | 35 | """ |
|
41 | 36 | Notify user of individual reports |
|
42 | 37 | |
|
43 | 38 | kwargs: |
|
44 | 39 | application: application that the event applies for, |
|
45 | 40 | user: user that should be notified |
|
46 | 41 | request: request object |
|
47 | 42 | since_when: reports are newer than this time value, |
|
48 | 43 | reports: list of reports to render |
|
49 | 44 | |
|
50 | 45 | """ |
|
51 | 46 | template_vars = self.report_alert_notification_vars(kwargs) |
|
52 | 47 | |
|
53 | 48 | if template_vars['confirmed_total'] > 1: |
|
54 | 49 | template_vars["title"] = "AppEnlight :: %s - %s reports" % ( |
|
55 | 50 | template_vars['resource_name'], |
|
56 | 51 | template_vars['confirmed_total'], |
|
57 | 52 | ) |
|
58 | 53 | else: |
|
59 | 54 | error_title = truncate(template_vars['reports'][0][1].error or |
|
60 | 55 | 'slow report', 20) |
|
61 | 56 | template_vars["title"] = "AppEnlight :: %s - '%s' report" % ( |
|
62 | 57 | template_vars['resource_name'], |
|
63 | 58 | error_title) |
|
64 | 59 | UserService.send_email(kwargs['request'], |
|
65 | 60 | [self.channel_value], |
|
66 | 61 | template_vars, |
|
67 | 62 | '/email_templates/notify_reports.jinja2') |
|
68 | 63 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( |
|
69 | 64 | kwargs['user'].user_name, |
|
70 | 65 | self.channel_visible_value, |
|
71 | 66 | template_vars['confirmed_total']) |
|
72 | 67 | log.warning(log_msg) |
|
73 | 68 | |
|
74 | 69 | def send_digest(self, **kwargs): |
|
75 | 70 | """ |
|
76 | 71 | Build and send daily digest notification |
|
77 | 72 | |
|
78 | 73 | kwargs: |
|
79 | 74 | application: application that the event applies for, |
|
80 | 75 | user: user that should be notified |
|
81 | 76 | request: request object |
|
82 | 77 | since_when: reports are newer than this time value, |
|
83 | 78 | reports: list of reports to render |
|
84 | 79 | |
|
85 | 80 | """ |
|
86 | 81 | template_vars = self.report_alert_notification_vars(kwargs) |
|
87 | 82 | title = "AppEnlight :: Daily report digest: %s - %s reports" |
|
88 | 83 | template_vars["email_title"] = title % ( |
|
89 | 84 | template_vars['resource_name'], |
|
90 | 85 | template_vars['confirmed_total'], |
|
91 | 86 | ) |
|
92 | 87 | |
|
93 | 88 | UserService.send_email(kwargs['request'], |
|
94 | 89 | [self.channel_value], |
|
95 | 90 | template_vars, |
|
96 | 91 | '/email_templates/notify_reports.jinja2', |
|
97 | 92 | immediately=True, |
|
98 | 93 | silent=True) |
|
99 | 94 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( |
|
100 | 95 | kwargs['user'].user_name, |
|
101 | 96 | self.channel_visible_value, |
|
102 | 97 | template_vars['confirmed_total']) |
|
103 | 98 | log.warning(log_msg) |
|
104 | 99 | |
|
105 | 100 | def notify_report_alert(self, **kwargs): |
|
106 | 101 | """ |
|
107 | 102 | Build and send report alert notification |
|
108 | 103 | |
|
109 | 104 | Kwargs: |
|
110 | 105 | application: application that the event applies for, |
|
111 | 106 | event: event that is notified, |
|
112 | 107 | user: user that should be notified |
|
113 | 108 | request: request object |
|
114 | 109 | |
|
115 | 110 | """ |
|
116 | 111 | template_vars = self.report_alert_notification_vars(kwargs) |
|
117 | 112 | |
|
118 | 113 | if kwargs['event'].unified_alert_action() == 'OPEN': |
|
119 | 114 | title = 'AppEnlight :: ALERT %s: %s - %s %s' % ( |
|
120 | 115 | template_vars['alert_action'], |
|
121 | 116 | template_vars['resource_name'], |
|
122 | 117 | kwargs['event'].values['reports'], |
|
123 | 118 | template_vars['report_type'], |
|
124 | 119 | ) |
|
125 | 120 | else: |
|
126 | 121 | title = 'AppEnlight :: ALERT %s: %s type: %s' % ( |
|
127 | 122 | template_vars['alert_action'], |
|
128 | 123 | template_vars['resource_name'], |
|
129 | 124 | template_vars['alert_type'].replace('_', ' '), |
|
130 | 125 | ) |
|
131 | 126 | template_vars['email_title'] = title |
|
132 | 127 | UserService.send_email(kwargs['request'], [self.channel_value], |
|
133 | 128 | template_vars, |
|
134 | 129 | '/email_templates/alert_reports.jinja2') |
|
135 | 130 | |
|
136 | 131 | def notify_uptime_alert(self, **kwargs): |
|
137 | 132 | """ |
|
138 | 133 | Build and send uptime alert notification |
|
139 | 134 | |
|
140 | 135 | Kwargs: |
|
141 | 136 | application: application that the event applies for, |
|
142 | 137 | event: event that is notified, |
|
143 | 138 | user: user that should be notified |
|
144 | 139 | request: request object |
|
145 | 140 | |
|
146 | 141 | """ |
|
147 | 142 | template_vars = self.uptime_alert_notification_vars(kwargs) |
|
148 | 143 | title = 'AppEnlight :: ALERT %s: %s has uptime issues' % ( |
|
149 | 144 | template_vars['alert_action'], |
|
150 | 145 | template_vars['resource_name'], |
|
151 | 146 | ) |
|
152 | 147 | template_vars['email_title'] = title |
|
153 | 148 | |
|
154 | 149 | UserService.send_email(kwargs['request'], [self.channel_value], |
|
155 | 150 | template_vars, |
|
156 | 151 | '/email_templates/alert_uptime.jinja2') |
|
157 | 152 | |
|
158 | 153 | def notify_chart_alert(self, **kwargs): |
|
159 | 154 | """ |
|
160 | 155 | Build and send chart alert notification |
|
161 | 156 | |
|
162 | 157 | Kwargs: |
|
163 | 158 | application: application that the event applies for, |
|
164 | 159 | event: event that is notified, |
|
165 | 160 | user: user that should be notified |
|
166 | 161 | request: request object |
|
167 | 162 | |
|
168 | 163 | """ |
|
169 | 164 | template_vars = self.chart_alert_notification_vars(kwargs) |
|
170 | 165 | |
|
171 | 166 | title = 'AppEnlight :: ALERT {} value in "{}" chart' \ |
|
172 | 167 | ' met alert "{}" criteria'.format( |
|
173 | 168 | template_vars['alert_action'], |
|
174 | 169 | template_vars['chart_name'], |
|
175 | 170 | template_vars['action_name'], |
|
176 | 171 | ) |
|
177 | 172 | template_vars['email_title'] = title |
|
178 | 173 | UserService.send_email(kwargs['request'], [self.channel_value], |
|
179 | 174 | template_vars, |
|
180 | 175 | '/email_templates/alert_chart.jinja2') |
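The email channel's title logic above has two branches: a count summary when more than one report is confirmed, and a truncated error title otherwise. A standalone sketch under simplified assumptions: `reports` here is a plain list of error strings (or `None` for slow reports) rather than the report tuples used above, and `truncate` is a minimal stand-in for `webhelpers2.text.truncate`:

```python
def truncate(text, length=20, indicator='...'):
    # Minimal stand-in for webhelpers2.text.truncate: cap the string
    # at `length` characters including the indicator.
    if len(text) <= length:
        return text
    return text[:length - len(indicator)].rsplit(' ', 1)[0] + indicator

def email_title(resource_name, reports):
    # Same branching as EmailAlertChannel.notify_reports above.
    if len(reports) > 1:
        return "AppEnlight :: %s - %s reports" % (resource_name, len(reports))
    error_title = truncate(reports[0] or 'slow report', 20)
    return "AppEnlight :: %s - '%s' report" % (resource_name, error_title)

single = email_title('myapp', [None])       # slow-report fallback
many = email_title('myapp', ['E1', 'E2'])   # count summary
```

The `or 'slow report'` fallback matters: slow-request reports carry no error string, so without it the title would render `'None'`.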
@@ -1,238 +1,233 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | from appenlight.models.alert_channel import AlertChannel |
|
24 | 19 | from appenlight.models.integrations.flowdock import FlowdockIntegration |
|
25 | 20 | from webhelpers2.text import truncate |
|
26 | 21 | |
|
27 | 22 | log = logging.getLogger(__name__) |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class FlowdockAlertChannel(AlertChannel): |
|
31 | 26 | __mapper_args__ = { |
|
32 | 27 | 'polymorphic_identity': 'flowdock' |
|
33 | 28 | } |
|
34 | 29 | |
|
35 | 30 | def notify_reports(self, **kwargs): |
|
36 | 31 | """ |
|
37 | 32 | Notify user of individual reports |
|
38 | 33 | |
|
39 | 34 | kwargs: |
|
40 | 35 | application: application that the event applies for, |
|
41 | 36 | user: user that should be notified |
|
42 | 37 | request: request object |
|
43 | 38 | since_when: reports are newer than this time value, |
|
44 | 39 | reports: list of reports to render |
|
45 | 40 | |
|
46 | 41 | """ |
|
47 | 42 | template_vars = self.report_alert_notification_vars(kwargs) |
|
48 | 43 | |
|
49 | 44 | app_url = kwargs['request'].registry.settings['_mail_url'] |
|
50 | 45 | destination_url = kwargs['request'].route_url('/', |
|
51 | 46 | _app_url=app_url) |
|
52 | 47 | f_args = ('report', |
|
53 | 48 | template_vars['resource'].resource_id, |
|
54 | 49 | template_vars['url_start_date'].strftime('%Y-%m-%dT%H:%M'), |
|
55 | 50 | template_vars['url_end_date'].strftime('%Y-%m-%dT%H:%M')) |
|
56 | 51 | destination_url += 'ui/{}?resource={}&start_date={}&end_date={}'.format( |
|
57 | 52 | *f_args) |
|
58 | 53 | |
|
59 | 54 | if template_vars['confirmed_total'] > 1: |
|
60 | 55 | template_vars["title"] = "%s - %s reports" % ( |
|
61 | 56 | template_vars['resource_name'], |
|
62 | 57 | template_vars['confirmed_total'], |
|
63 | 58 | ) |
|
64 | 59 | else: |
|
65 | 60 | error_title = truncate(template_vars['reports'][0][1].error or |
|
66 | 61 | 'slow report', 90) |
|
67 | 62 | template_vars["title"] = "%s - '%s' report" % ( |
|
68 | 63 | template_vars['resource_name'], |
|
69 | 64 | error_title) |
|
70 | 65 | |
|
71 | 66 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( |
|
72 | 67 | kwargs['user'].user_name, |
|
73 | 68 | self.channel_visible_value, |
|
74 | 69 | template_vars['confirmed_total']) |
|
75 | 70 | log.warning(log_msg) |
|
76 | 71 | |
|
77 | 72 | client = FlowdockIntegration.create_client( |
|
78 | 73 | self.integration.config['api_token']) |
|
79 | 74 | payload = { |
|
80 | 75 | "source": "AppEnlight", |
|
81 | 76 | "from_address": kwargs['request'].registry.settings[ |
|
82 | 77 | 'mailing.from_email'], |
|
83 | 78 | "subject": template_vars["title"], |
|
84 | 79 | "content": "New report present", |
|
85 | 80 | "tags": ["appenlight"], |
|
86 | 81 | "link": destination_url |
|
87 | 82 | } |
|
88 | 83 | client.send_to_inbox(payload) |
|
89 | 84 | |
|
90 | 85 | def notify_report_alert(self, **kwargs): |
|
91 | 86 | """ |
|
92 | 87 | Build and send report alert notification |
|
93 | 88 | |
|
94 | 89 | Kwargs: |
|
95 | 90 | application: application that the event applies for, |
|
96 | 91 | event: event that is notified, |
|
97 | 92 | user: user that should be notified |
|
98 | 93 | request: request object |
|
99 | 94 | |
|
100 | 95 | """ |
|
101 | 96 | template_vars = self.report_alert_notification_vars(kwargs) |
|
102 | 97 | |
|
103 | 98 | if kwargs['event'].unified_alert_action() == 'OPEN': |
|
104 | 99 | |
|
105 | 100 | title = 'ALERT %s: %s - %s %s' % ( |
|
106 | 101 | template_vars['alert_action'], |
|
107 | 102 | template_vars['resource_name'], |
|
108 | 103 | kwargs['event'].values['reports'], |
|
109 | 104 | template_vars['report_type'], |
|
110 | 105 | ) |
|
111 | 106 | |
|
112 | 107 | else: |
|
113 | 108 | title = 'ALERT %s: %s type: %s' % ( |
|
114 | 109 | template_vars['alert_action'], |
|
115 | 110 | template_vars['resource_name'], |
|
116 | 111 | template_vars['alert_type'].replace('_', ' '), |
|
117 | 112 | ) |
|
118 | 113 | |
|
119 | 114 | client = FlowdockIntegration.create_client( |
|
120 | 115 | self.integration.config['api_token']) |
|
121 | 116 | payload = { |
|
122 | 117 | "source": "AppEnlight", |
|
123 | 118 | "from_address": kwargs['request'].registry.settings[ |
|
124 | 119 | 'mailing.from_email'], |
|
125 | 120 | "subject": title, |
|
126 | 121 | "content": 'Investigation required', |
|
127 | 122 | "tags": ["appenlight", "alert", template_vars['alert_type']], |
|
128 | 123 | "link": template_vars['destination_url'] |
|
129 | 124 | } |
|
130 | 125 | client.send_to_inbox(payload) |
|
131 | 126 | |
|
132 | 127 | def notify_uptime_alert(self, **kwargs): |
|
133 | 128 | """ |
|
134 | 129 | Build and send uptime alert notification |
|
135 | 130 | |
|
136 | 131 | Kwargs: |
|
137 | 132 | application: application that the event applies for, |
|
138 | 133 | event: event that is notified, |
|
139 | 134 | user: user that should be notified |
|
140 | 135 | request: request object |
|
141 | 136 | |
|
142 | 137 | """ |
|
143 | 138 | template_vars = self.uptime_alert_notification_vars(kwargs) |
|
144 | 139 | |
|
145 | 140 | message = 'ALERT %s: %s has uptime issues' % ( |
|
146 | 141 | template_vars['alert_action'], |
|
147 | 142 | template_vars['resource_name'], |
|
148 | 143 | ) |
|
149 | 144 | submessage = 'Info: ' |
|
150 | 145 | submessage += template_vars['reason'] |
|
151 | 146 | |
|
152 | 147 | client = FlowdockIntegration.create_client( |
|
153 | 148 | self.integration.config['api_token']) |
|
154 | 149 | payload = { |
|
155 | 150 | "source": "AppEnlight", |
|
156 | 151 | "from_address": kwargs['request'].registry.settings[ |
|
157 | 152 | 'mailing.from_email'], |
|
158 | 153 | "subject": message, |
|
159 | 154 | "content": submessage, |
|
160 | 155 | "tags": ["appenlight", "alert", 'uptime'], |
|
161 | 156 | "link": template_vars['destination_url'] |
|
162 | 157 | } |
|
163 | 158 | client.send_to_inbox(payload) |
|
164 | 159 | |
|
165 | 160 | def send_digest(self, **kwargs): |
|
166 | 161 | """ |
|
167 | 162 | Build and send daily digest notification |
|
168 | 163 | |
|
169 | 164 | kwargs: |
|
170 | 165 | application: application that the event applies for, |
|
171 | 166 | user: user that should be notified |
|
172 | 167 | request: request object |
|
173 | 168 | since_when: reports are newer than this time value, |
|
174 | 169 | reports: list of reports to render |
|
175 | 170 | |
|
176 | 171 | """ |
|
177 | 172 | template_vars = self.report_alert_notification_vars(kwargs) |
|
178 | 173 | message = "Daily report digest: %s - %s reports" % ( |
|
179 | 174 | template_vars['resource_name'], template_vars['confirmed_total']) |
|
180 | 175 | |
|
181 | 176 | f_args = (template_vars['confirmed_total'], |
|
182 | 177 | template_vars['timestamp']) |
|
183 | 178 | |
|
184 | 179 | payload = { |
|
185 | 180 | "source": "AppEnlight", |
|
186 | 181 | "from_address": kwargs['request'].registry.settings[ |
|
187 | 182 | 'mailing.from_email'], |
|
188 | 183 | "subject": message, |
|
189 | 184 | "content": '%s reports in total since %s' % f_args, |
|
190 | 185 | "tags": ["appenlight", "digest"], |
|
191 | 186 | "link": template_vars['destination_url'] |
|
192 | 187 | } |
|
193 | 188 | |
|
194 | 189 | client = FlowdockIntegration.create_client( |
|
195 | 190 | self.integration.config['api_token']) |
|
196 | 191 | client.send_to_inbox(payload) |
|
197 | 192 | |
|
198 | 193 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( |
|
199 | 194 | kwargs['user'].user_name, |
|
200 | 195 | self.channel_visible_value, |
|
201 | 196 | template_vars['confirmed_total']) |
|
202 | 197 | log.warning(log_msg) |
|
203 | 198 | |
|
204 | 199 | def notify_chart_alert(self, **kwargs): |
|
205 | 200 | """ |
|
206 | 201 | Build and send chart alert notification |
|
207 | 202 | |
|
208 | 203 | Kwargs: |
|
209 | 204 | application: application that the event applies for, |
|
210 | 205 | event: event that is notified, |
|
211 | 206 | user: user that should be notified |
|
212 | 207 | request: request object |
|
213 | 208 | |
|
214 | 209 | """ |
|
215 | 210 | template_vars = self.chart_alert_notification_vars(kwargs) |
|
216 | 211 | |
|
217 | 212 | message = 'ALERT {}: value in "{}" chart ' \ |
|
218 | 213 | 'met alert "{}" criteria'.format( |
|
219 | 214 | template_vars['alert_action'], |
|
220 | 215 | template_vars['chart_name'], |
|
221 | 216 | template_vars['action_name'], |
|
222 | 217 | ) |
|
223 | 218 | submessage = 'Info: ' |
|
224 | 219 | for item in template_vars['readable_values']: |
|
225 | 220 | submessage += '{}: {}\n'.format(item['label'], item['value']) |
|
226 | 221 | |
|
227 | 222 | client = FlowdockIntegration.create_client( |
|
228 | 223 | self.integration.config['api_token']) |
|
229 | 224 | payload = { |
|
230 | 225 | "source": "AppEnlight", |
|
231 | 226 | "from_address": kwargs['request'].registry.settings[ |
|
232 | 227 | 'mailing.from_email'], |
|
233 | 228 | "subject": message, |
|
234 | 229 | "content": submessage, |
|
235 | 230 | "tags": ["appenlight", "alert", 'chart'], |
|
236 | 231 | "link": template_vars['destination_url'] |
|
237 | 232 | } |
|
238 | 233 | client.send_to_inbox(payload) |
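Every Flowdock notification above posts the same Team Inbox payload shape, varying only the subject, content, tags, and link. A sketch of that shared structure; `build_inbox_payload` is a hypothetical helper, not part of the channel's API:

```python
def build_inbox_payload(subject, content, tags, link, from_email):
    # Shape of the Team Inbox message the channel posts; the keys
    # mirror the payload dicts assembled in the methods above.
    return {
        "source": "AppEnlight",
        "from_address": from_email,
        "subject": subject,
        "content": content,
        "tags": ["appenlight"] + list(tags),
        "link": link,
    }

payload = build_inbox_payload(
    subject='ALERT OPEN: myapp type: uptime',
    content='Investigation required',
    tags=['alert', 'uptime'],
    link='https://example.invalid/ui/report',
    from_email='alerts@example.invalid')
```

Factoring the dict out this way would remove the four near-identical payload literals in `notify_reports`, `notify_report_alert`, `notify_uptime_alert`, and `notify_chart_alert`.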
@@ -1,234 +1,229 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | from appenlight.models.alert_channel import AlertChannel |
|
24 | 19 | from appenlight.models.integrations.hipchat import HipchatIntegration |
|
25 | 20 | from webhelpers2.text import truncate |
|
26 | 21 | |
|
27 | 22 | log = logging.getLogger(__name__) |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class HipchatAlertChannel(AlertChannel): |
|
31 | 26 | __mapper_args__ = { |
|
32 | 27 | 'polymorphic_identity': 'hipchat' |
|
33 | 28 | } |
|
34 | 29 | |
|
35 | 30 | def notify_reports(self, **kwargs): |
|
36 | 31 | """ |
|
37 | 32 | Notify user of individual reports |
|
38 | 33 | |
|
39 | 34 | kwargs: |
|
40 | 35 | application: application that the event applies for, |
|
41 | 36 | user: user that should be notified |
|
42 | 37 | request: request object |
|
43 | 38 | since_when: reports are newer than this time value, |
|
44 | 39 | reports: list of reports to render |
|
45 | 40 | |
|
46 | 41 | """ |
|
47 | 42 | template_vars = self.report_alert_notification_vars(kwargs) |
|
48 | 43 | |
|
49 | 44 | app_url = kwargs['request'].registry.settings['_mail_url'] |
|
50 | 45 | destination_url = kwargs['request'].route_url('/', |
|
51 | 46 | _app_url=app_url) |
|
52 | 47 | f_args = ('report', |
|
53 | 48 | template_vars['resource'].resource_id, |
|
54 | 49 | template_vars['url_start_date'].strftime('%Y-%m-%dT%H:%M'), |
|
55 | 50 | template_vars['url_end_date'].strftime('%Y-%m-%dT%H:%M')) |
|
56 | 51 | destination_url += 'ui/{}?resource={}&start_date={}&end_date={}'.format( |
|
57 | 52 | *f_args) |
|
58 | 53 | |
|
59 | 54 | if template_vars['confirmed_total'] > 1: |
|
60 | 55 | template_vars["title"] = "%s - %s reports" % ( |
|
61 | 56 | template_vars['resource_name'], |
|
62 | 57 | template_vars['confirmed_total'], |
|
63 | 58 | ) |
|
64 | 59 | else: |
|
65 | 60 | error_title = truncate(template_vars['reports'][0][1].error or |
|
66 | 61 | 'slow report', 90) |
|
67 | 62 | template_vars["title"] = "%s - '%s' report" % ( |
|
68 | 63 | template_vars['resource_name'], |
|
69 | 64 | error_title) |
|
70 | 65 | |
|
71 | 66 | template_vars["title"] += ' ' + destination_url |
|
72 | 67 | |
|
73 | 68 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( |
|
74 | 69 | kwargs['user'].user_name, |
|
75 | 70 | self.channel_visible_value, |
|
76 | 71 | template_vars['confirmed_total']) |
|
77 | 72 | log.warning(log_msg) |
|
78 | 73 | |
|
79 | 74 | client = HipchatIntegration.create_client( |
|
80 | 75 | self.integration.config['api_token']) |
|
81 | 76 | for room in self.integration.config['rooms'].split(','): |
|
82 | 77 | client.send({ |
|
83 | 78 | "message_format": 'text', |
|
84 | 79 | "message": template_vars["title"], |
|
85 | 80 | "from": "AppEnlight", |
|
86 | 81 | "room_id": room.strip(), |
|
87 | 82 | "color": "yellow" |
|
88 | 83 | }) |
|
89 | 84 | |
|
90 | 85 | def notify_report_alert(self, **kwargs): |
|
91 | 86 | """ |
|
92 | 87 | Build and send report alert notification |
|
93 | 88 | |
|
94 | 89 | Kwargs: |
|
95 | 90 | application: application that the event applies for, |
|
96 | 91 | event: event that is notified, |
|
97 | 92 | user: user that should be notified |
|
98 | 93 | request: request object |
|
99 | 94 | |
|
100 | 95 | """ |
|
101 | 96 | template_vars = self.report_alert_notification_vars(kwargs) |
|
102 | 97 | |
|
103 | 98 | if kwargs['event'].unified_alert_action() == 'OPEN': |
|
104 | 99 | |
|
105 | 100 | title = 'ALERT %s: %s - %s %s' % ( |
|
106 | 101 | template_vars['alert_action'], |
|
107 | 102 | template_vars['resource_name'], |
|
108 | 103 | kwargs['event'].values['reports'], |
|
109 | 104 | template_vars['report_type'], |
|
110 | 105 | ) |
|
111 | 106 | |
|
112 | 107 | else: |
|
113 | 108 | title = 'ALERT %s: %s type: %s' % ( |
|
114 | 109 | template_vars['alert_action'], |
|
115 | 110 | template_vars['resource_name'], |
|
116 | 111 | template_vars['alert_type'].replace('_', ' '), |
|
117 | 112 | ) |
|
118 | 113 | |
|
119 | 114 | title += '\n ' + template_vars['destination_url'] |
|
120 | 115 | |
|
121 | 116 | api_token = self.integration.config['api_token'] |
|
122 | 117 | client = HipchatIntegration.create_client(api_token) |
|
123 | 118 | for room in self.integration.config['rooms'].split(','): |
|
124 | 119 | client.send({ |
|
125 | 120 | "message_format": 'text', |
|
126 | 121 | "message": title, |
|
127 | 122 | "from": "AppEnlight", |
|
128 | 123 | "room_id": room.strip(), |
|
129 | 124 | "color": "red", |
|
130 | 125 | "notify": '1' |
|
131 | 126 | }) |
|
132 | 127 | |
|
133 | 128 | def notify_uptime_alert(self, **kwargs): |
|
134 | 129 | """ |
|
135 | 130 | Build and send uptime alert notification |
|
136 | 131 | |
|
137 | 132 | Kwargs: |
|
138 | 133 | application: application that the event applies for, |
|
139 | 134 | event: event that is notified, |
|
140 | 135 | user: user that should be notified |
|
141 | 136 | request: request object |
|
142 | 137 | |
|
143 | 138 | """ |
|
144 | 139 | template_vars = self.uptime_alert_notification_vars(kwargs) |
|
145 | 140 | |
|
146 | 141 | message = 'ALERT %s: %s has uptime issues\n' % ( |
|
147 | 142 | template_vars['alert_action'], |
|
148 | 143 | template_vars['resource_name'], |
|
149 | 144 | ) |
|
150 | 145 | message += template_vars['reason'] |
|
151 | 146 | message += '\n{}'.format(template_vars['destination_url']) |
|
152 | 147 | |
|
153 | 148 | api_token = self.integration.config['api_token'] |
|
154 | 149 | client = HipchatIntegration.create_client(api_token) |
|
155 | 150 | for room in self.integration.config['rooms'].split(','): |
|
156 | 151 | client.send({ |
|
157 | 152 | "message_format": 'text', |
|
158 | 153 | "message": message, |
|
159 | 154 | "from": "AppEnlight", |
|
160 | 155 | "room_id": room.strip(), |
|
161 | 156 | "color": "red", |
|
162 | 157 | "notify": '1' |
|
163 | 158 | }) |
|
164 | 159 | |
|
165 | 160 | def notify_chart_alert(self, **kwargs): |
|
166 | 161 | """ |
|
167 | 162 | Build and send chart alert notification |
|
168 | 163 | |
|
169 | 164 | Kwargs: |
|
170 | 165 | application: application that the event applies for, |
|
171 | 166 | event: event that is notified, |
|
172 | 167 | user: user that should be notified |
|
173 | 168 | request: request object |
|
174 | 169 | |
|
175 | 170 | """ |
|
176 | 171 | template_vars = self.chart_alert_notification_vars(kwargs) |
|
177 | 172 | message = 'ALERT {}: value in "{}" chart: ' \ |
|
178 | 173 | 'met alert "{}" criteria\n'.format( |
|
179 | 174 | template_vars['alert_action'], |
|
180 | 175 | template_vars['chart_name'], |
|
181 | 176 | template_vars['action_name'], |
|
182 | 177 | ) |
|
183 | 178 | |
|
184 | 179 | for item in template_vars['readable_values']: |
|
185 | 180 | message += '{}: {}\n'.format(item['label'], item['value']) |
|
186 | 181 | |
|
187 | 182 | message += template_vars['destination_url'] |
|
188 | 183 | |
|
189 | 184 | api_token = self.integration.config['api_token'] |
|
190 | 185 | client = HipchatIntegration.create_client(api_token) |
|
191 | 186 | for room in self.integration.config['rooms'].split(','): |
|
192 | 187 | client.send({ |
|
193 | 188 | "message_format": 'text', |
|
194 | 189 | "message": message, |
|
195 | 190 | "from": "AppEnlight", |
|
196 | 191 | "room_id": room.strip(), |
|
197 | 192 | "color": "red", |
|
198 | 193 | "notify": '1' |
|
199 | 194 | }) |
|
200 | 195 | |
|
201 | 196 | def send_digest(self, **kwargs): |
|
202 | 197 | """ |
|
203 | 198 | Build and send daily digest notification |
|
204 | 199 | |
|
205 | 200 | kwargs: |
|
206 | 201 | application: application that the event applies for, |
|
207 | 202 | user: user that should be notified |
|
208 | 203 | request: request object |
|
209 | 204 | since_when: reports are newer than this time value, |
|
210 | 205 | reports: list of reports to render |
|
211 | 206 | |
|
212 | 207 | """ |
|
213 | 208 | template_vars = self.report_alert_notification_vars(kwargs) |
|
214 | 209 | f_args = (template_vars['resource_name'], |
|
215 | 210 | template_vars['confirmed_total'],) |
|
216 | 211 | message = "Daily report digest: %s - %s reports" % f_args |
|
217 | 212 | message += '\n{}'.format(template_vars['destination_url']) |
|
218 | 213 | api_token = self.integration.config['api_token'] |
|
219 | 214 | client = HipchatIntegration.create_client(api_token) |
|
220 | 215 | for room in self.integration.config['rooms'].split(','): |
|
221 | 216 | client.send({ |
|
222 | 217 | "message_format": 'text', |
|
223 | 218 | "message": message, |
|
224 | 219 | "from": "AppEnlight", |
|
225 | 220 | "room_id": room.strip(), |
|
226 | 221 | "color": "green", |
|
227 | 222 | "notify": '1' |
|
228 | 223 | }) |
|
229 | 224 | |
|
230 | 225 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( |
|
231 | 226 | kwargs['user'].user_name, |
|
232 | 227 | self.channel_visible_value, |
|
233 | 228 | template_vars['confirmed_total']) |
|
234 | 229 | log.warning(log_msg) |
@@ -1,290 +1,285 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | from appenlight.models.alert_channel import AlertChannel |
|
24 | 19 | from appenlight.models.integrations.slack import SlackIntegration |
|
25 | 20 | from webhelpers2.text import truncate |
|
26 | 21 | |
|
27 | 22 | log = logging.getLogger(__name__) |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class SlackAlertChannel(AlertChannel): |
|
31 | 26 | __mapper_args__ = { |
|
32 | 27 | 'polymorphic_identity': 'slack' |
|
33 | 28 | } |
|
34 | 29 | |
|
35 | 30 | def notify_reports(self, **kwargs): |
|
36 | 31 | """ |
|
37 | 32 | Notify user of individual reports |
|
38 | 33 | |
|
39 | 34 | kwargs: |
|
40 | 35 | application: application that the event applies for, |
|
41 | 36 | user: user that should be notified |
|
42 | 37 | request: request object |
|
43 | 38 | since_when: reports are newer than this time value, |
|
44 | 39 | reports: list of reports to render |
|
45 | 40 | |
|
46 | 41 | """ |
|
47 | 42 | template_vars = self.report_alert_notification_vars(kwargs) |
|
48 | 43 | template_vars["title"] = template_vars['resource_name'] |
|
49 | 44 | |
|
50 | 45 | if template_vars['confirmed_total'] > 1: |
|
51 | 46 | template_vars['subtext'] = '%s reports' % template_vars[ |
|
52 | 47 | 'confirmed_total'] |
|
53 | 48 | else: |
|
54 | 49 | error_title = truncate(template_vars['reports'][0][1].error or |
|
55 | 50 | 'slow report', 90) |
|
56 | 51 | template_vars['subtext'] = error_title |
|
57 | 52 | |
|
58 | 53 | log_msg = 'NOTIFY : %s via %s :: %s reports' % ( |
|
59 | 54 | kwargs['user'].user_name, |
|
60 | 55 | self.channel_visible_value, |
|
61 | 56 | template_vars['confirmed_total']) |
|
62 | 57 | log.warning(log_msg) |
|
63 | 58 | |
|
64 | 59 | client = SlackIntegration.create_client( |
|
65 | 60 | self.integration.config['webhook_url']) |
|
66 | 61 | report_data = { |
|
67 | 62 | "username": "AppEnlight", |
|
68 | 63 | "icon_emoji": ":fire:", |
|
69 | 64 | "attachments": [ |
|
70 | 65 | { |
|
71 | 66 | "mrkdwn_in": ["text", "pretext", "title", "fallback"], |
|
72 | 67 | "fallback": "*%s* - <%s| Browse>" % ( |
|
73 | 68 | template_vars["title"], |
|
74 | 69 | template_vars['destination_url']), |
|
75 | 70 | "pretext": "*%s* - <%s| Browse>" % ( |
|
76 | 71 | template_vars["title"], |
|
77 | 72 | template_vars['destination_url']), |
|
78 | 73 | "color": "warning", |
|
79 | 74 | "fields": [ |
|
80 | 75 | { |
|
81 | 76 | "value": 'Info: %s' % template_vars['subtext'], |
|
82 | 77 | "short": False |
|
83 | 78 | } |
|
84 | 79 | ] |
|
85 | 80 | } |
|
86 | 81 | ] |
|
87 | 82 | } |
|
88 | 83 | client.make_request(data=report_data) |
|
89 | 84 | |
|
90 | 85 | def notify_report_alert(self, **kwargs): |
|
91 | 86 | """ |
|
92 | 87 | Build and send report alert notification |
|
93 | 88 | |
|
94 | 89 | Kwargs: |
|
95 | 90 | application: application that the event applies for, |
|
96 | 91 | event: event that is notified, |
|
97 | 92 | user: user that should be notified |
|
98 | 93 | request: request object |
|
99 | 94 | |
|
100 | 95 | """ |
|
101 | 96 | template_vars = self.report_alert_notification_vars(kwargs) |
|
102 | 97 | |
|
103 | 98 | if kwargs['event'].unified_alert_action() == 'OPEN': |
|
104 | 99 | title = '*ALERT %s*: %s' % ( |
|
105 | 100 | template_vars['alert_action'], |
|
106 | 101 | template_vars['resource_name'] |
|
107 | 102 | ) |
|
108 | 103 | |
|
109 | 104 | template_vars['subtext'] = 'Got at least %s %s' % ( |
|
110 | 105 | kwargs['event'].values['reports'], |
|
111 | 106 | template_vars['report_type'] |
|
112 | 107 | ) |
|
113 | 108 | |
|
114 | 109 | else: |
|
115 | 110 | title = '*ALERT %s*: %s' % ( |
|
116 | 111 | template_vars['alert_action'], |
|
117 | 112 | template_vars['resource_name'], |
|
118 | 113 | ) |
|
119 | 114 | |
|
120 | 115 | template_vars['subtext'] = '' |
|
121 | 116 | |
|
122 | 117 | alert_type = template_vars['alert_type'].replace('_', ' ') |
|
123 | 118 | alert_type = alert_type.replace('alert', '').capitalize() |
|
124 | 119 | |
|
125 | 120 | template_vars['type'] = "Type: %s" % alert_type |
|
126 | 121 | |
|
127 | 122 | client = SlackIntegration.create_client( |
|
128 | 123 | self.integration.config['webhook_url'] |
|
129 | 124 | ) |
|
130 | 125 | report_data = { |
|
131 | 126 | "username": "AppEnlight", |
|
132 | 127 | "icon_emoji": ":rage:", |
|
133 | 128 | "attachments": [ |
|
134 | 129 | { |
|
135 | 130 | "mrkdwn_in": ["text", "pretext", "title", "fallback"], |
|
136 | 131 | "fallback": "%s - <%s| Browse>" % ( |
|
137 | 132 | title, template_vars['destination_url']), |
|
138 | 133 | "pretext": "%s - <%s| Browse>" % ( |
|
139 | 134 | title, template_vars['destination_url']), |
|
140 | 135 | "color": "danger", |
|
141 | 136 | "fields": [ |
|
142 | 137 | { |
|
143 | 138 | "title": template_vars['type'], |
|
144 | 139 | "value": template_vars['subtext'], |
|
145 | 140 | "short": False |
|
146 | 141 | } |
|
147 | 142 | ] |
|
148 | 143 | } |
|
149 | 144 | ] |
|
150 | 145 | } |
|
151 | 146 | client.make_request(data=report_data) |
|
152 | 147 | |
|
153 | 148 | def notify_uptime_alert(self, **kwargs): |
|
154 | 149 | """ |
|
155 | 150 | Build and send uptime alert notification |
|
156 | 151 | |
|
157 | 152 | Kwargs: |
|
158 | 153 | application: application that the event applies for, |
|
159 | 154 | event: event that is notified, |
|
160 | 155 | user: user that should be notified |
|
161 | 156 | request: request object |
|
162 | 157 | |
|
163 | 158 | """ |
|
164 | 159 | template_vars = self.uptime_alert_notification_vars(kwargs) |
|
165 | 160 | |
|
166 | 161 | title = '*ALERT %s*: %s' % ( |
|
167 | 162 | template_vars['alert_action'], |
|
168 | 163 | template_vars['resource_name'], |
|
169 | 164 | ) |
|
170 | 165 | client = SlackIntegration.create_client( |
|
171 | 166 | self.integration.config['webhook_url'] |
|
172 | 167 | ) |
|
173 | 168 | report_data = { |
|
174 | 169 | "username": "AppEnlight", |
|
175 | 170 | "icon_emoji": ":rage:", |
|
176 | 171 | "attachments": [ |
|
177 | 172 | { |
|
178 | 173 | "mrkdwn_in": ["text", "pretext", "title", "fallback"], |
|
179 | 174 | "fallback": "{} - <{}| Browse>".format( |
|
180 | 175 | title, template_vars['destination_url']), |
|
181 | 176 | "pretext": "{} - <{}| Browse>".format( |
|
182 | 177 | title, template_vars['destination_url']), |
|
183 | 178 | "color": "danger", |
|
184 | 179 | "fields": [ |
|
185 | 180 | { |
|
186 | 181 | "title": "Application has uptime issues", |
|
187 | 182 | "value": template_vars['reason'], |
|
188 | 183 | "short": False |
|
189 | 184 | } |
|
190 | 185 | ] |
|
191 | 186 | } |
|
192 | 187 | ] |
|
193 | 188 | } |
|
194 | 189 | client.make_request(data=report_data) |
|
195 | 190 | |
|
196 | 191 | def notify_chart_alert(self, **kwargs): |
|
197 | 192 | """ |
|
198 | 193 | Build and send chart alert notification |
|
199 | 194 | |
|
200 | 195 | Kwargs: |
|
201 | 196 | application: application that the event applies for, |
|
202 | 197 | event: event that is notified, |
|
203 | 198 | user: user that should be notified |
|
204 | 199 | request: request object |
|
205 | 200 | |
|
206 | 201 | """ |
|
207 | 202 | template_vars = self.chart_alert_notification_vars(kwargs) |
|
208 | 203 | |
|
209 | 204 | title = '*ALERT {}*: value in *"{}"* chart ' \ |
|
210 | 205 | 'met alert *"{}"* criteria'.format( |
|
211 | 206 | template_vars['alert_action'], |
|
212 | 207 | template_vars['chart_name'], |
|
213 | 208 | template_vars['action_name'], |
|
214 | 209 | ) |
|
215 | 210 | |
|
216 | 211 | subtext = '' |
|
217 | 212 | for item in template_vars['readable_values']: |
|
218 | 213 | subtext += '{} - {}\n'.format(item['label'], item['value']) |
|
219 | 214 | |
|
220 | 215 | client = SlackIntegration.create_client( |
|
221 | 216 | self.integration.config['webhook_url'] |
|
222 | 217 | ) |
|
223 | 218 | report_data = { |
|
224 | 219 | "username": "AppEnlight", |
|
225 | 220 | "icon_emoji": ":rage:", |
|
226 | 221 | "attachments": [ |
|
227 | 222 | {"mrkdwn_in": ["text", "pretext", "title", "fallback"], |
|
228 | 223 | "fallback": "{} - <{}| Browse>".format( |
|
229 | 224 | title, template_vars['destination_url']), |
|
230 | 225 | "pretext": "{} - <{}| Browse>".format( |
|
231 | 226 | title, template_vars['destination_url']), |
|
232 | 227 | "color": "danger", |
|
233 | 228 | "fields": [ |
|
234 | 229 | { |
|
235 | 230 | "title": "Following criteria were met:", |
|
236 | 231 | "value": subtext, |
|
237 | 232 | "short": False |
|
238 | 233 | } |
|
239 | 234 | ] |
|
240 | 235 | } |
|
241 | 236 | ] |
|
242 | 237 | } |
|
243 | 238 | client.make_request(data=report_data) |
|
244 | 239 | |
|
245 | 240 | def send_digest(self, **kwargs): |
|
246 | 241 | """ |
|
247 | 242 | Build and send daily digest notification |
|
248 | 243 | |
|
249 | 244 | kwargs: |
|
250 | 245 | application: application that the event applies for, |
|
251 | 246 | user: user that should be notified |
|
252 | 247 | request: request object |
|
253 | 248 | since_when: reports are newer than this time value, |
|
254 | 249 | reports: list of reports to render |
|
255 | 250 | |
|
256 | 251 | """ |
|
257 | 252 | template_vars = self.report_alert_notification_vars(kwargs) |
|
258 | 253 | title = "*Daily report digest*: %s" % template_vars['resource_name'] |
|
259 | 254 | |
|
260 | 255 | subtext = '%s reports' % template_vars['confirmed_total'] |
|
261 | 256 | |
|
262 | 257 | client = SlackIntegration.create_client( |
|
263 | 258 | self.integration.config['webhook_url'] |
|
264 | 259 | ) |
|
265 | 260 | report_data = { |
|
266 | 261 | "username": "AppEnlight", |
|
267 | 262 | "attachments": [ |
|
268 | 263 | { |
|
269 | 264 | "mrkdwn_in": ["text", "pretext", "title", "fallback"], |
|
270 | 265 | "fallback": "%s : <%s| Browse>" % ( |
|
271 | 266 | title, template_vars['destination_url']), |
|
272 | 267 | "pretext": "%s: <%s| Browse>" % ( |
|
273 | 268 | title, template_vars['destination_url']), |
|
274 | 269 | "color": "good", |
|
275 | 270 | "fields": [ |
|
276 | 271 | { |
|
277 | 272 | "title": "Got at least: %s" % subtext, |
|
278 | 273 | "short": False |
|
279 | 274 | } |
|
280 | 275 | ] |
|
281 | 276 | } |
|
282 | 277 | ] |
|
283 | 278 | } |
|
284 | 279 | client.make_request(data=report_data) |
|
285 | 280 | |
|
286 | 281 | log_msg = 'DIGEST : %s via %s :: %s reports' % ( |
|
287 | 282 | kwargs['user'].user_name, |
|
288 | 283 | self.channel_visible_value, |
|
289 | 284 | template_vars['confirmed_total']) |
|
290 | 285 | log.warning(log_msg) |
@@ -1,109 +1,104 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import uuid |
|
23 | 18 | import logging |
|
24 | 19 | import sqlalchemy as sa |
|
25 | 20 | from appenlight.models.resource import Resource |
|
26 | 21 | from sqlalchemy.orm import aliased |
|
27 | 22 | |
|
28 | 23 | log = logging.getLogger(__name__) |
|
29 | 24 | |
|
30 | 25 | |
|
31 | 26 | def generate_api_key(): |
|
32 | 27 | uid = str(uuid.uuid4()).replace('-', '') |
|
33 | 28 | return uid[0:32] |
|
34 | 29 | |
|
35 | 30 | |
|
36 | 31 | class Application(Resource): |
|
37 | 32 | """ |
|
38 | 33 | Resource of application type |
|
39 | 34 | """ |
|
40 | 35 | |
|
41 | 36 | __tablename__ = 'applications' |
|
42 | 37 | __mapper_args__ = {'polymorphic_identity': 'application'} |
|
43 | 38 | |
|
44 | 39 | # lists configurable possible permissions for this resource type |
|
45 | 40 | __possible_permissions__ = ('view', 'update_reports') |
|
46 | 41 | |
|
47 | 42 | resource_id = sa.Column(sa.Integer(), |
|
48 | 43 | sa.ForeignKey('resources.resource_id', |
|
49 | 44 | onupdate='CASCADE', |
|
50 | 45 | ondelete='CASCADE', ), |
|
51 | 46 | primary_key=True, ) |
|
52 | 47 | domains = sa.Column(sa.UnicodeText(), nullable=False, default='') |
|
53 | 48 | api_key = sa.Column(sa.String(32), nullable=False, unique=True, index=True, |
|
54 | 49 | default=generate_api_key) |
|
55 | 50 | public_key = sa.Column(sa.String(32), nullable=False, unique=True, |
|
56 | 51 | index=True, |
|
57 | 52 | default=generate_api_key) |
|
58 | 53 | default_grouping = sa.Column(sa.Unicode(20), nullable=False, |
|
59 | 54 | default='url_traceback') |
|
60 | 55 | error_report_threshold = sa.Column(sa.Integer(), default=10) |
|
61 | 56 | slow_report_threshold = sa.Column(sa.Integer(), default=10) |
|
62 | 57 | allow_permanent_storage = sa.Column(sa.Boolean(), default=False, |
|
63 | 58 | nullable=False) |
|
64 | 59 | |
|
65 | 60 | @sa.orm.validates('default_grouping') |
|
66 | 61 | def validate_default_grouping(self, key, grouping): |
|
67 | 62 | """ validate if resource can use a specific grouping """
|
68 | 63 | assert grouping in ['url_type', 'url_traceback', 'traceback_server'] |
|
69 | 64 | return grouping |
|
70 | 65 | |
|
71 | 66 | report_groups = sa.orm.relationship('ReportGroup', |
|
72 | 67 | cascade="all, delete-orphan", |
|
73 | 68 | passive_deletes=True, |
|
74 | 69 | passive_updates=True, |
|
75 | 70 | lazy='dynamic', |
|
76 | 71 | backref=sa.orm.backref('application', |
|
77 | 72 | lazy="joined")) |
|
78 | 73 | |
|
79 | 74 | postprocess_conf = sa.orm.relationship('ApplicationPostprocessConf', |
|
80 | 75 | cascade="all, delete-orphan", |
|
81 | 76 | passive_deletes=True, |
|
82 | 77 | passive_updates=True, |
|
83 | 78 | backref='resource') |
|
84 | 79 | |
|
85 | 80 | logs = sa.orm.relationship('Log', |
|
86 | 81 | lazy='dynamic', |
|
87 | 82 | backref='application', |
|
88 | 83 | passive_deletes=True, |
|
89 | 84 | passive_updates=True, ) |
|
90 | 85 | |
|
91 | 86 | integrations = sa.orm.relationship('IntegrationBase', |
|
92 | 87 | backref='resource', |
|
93 | 88 | cascade="all, delete-orphan", |
|
94 | 89 | passive_deletes=True, |
|
95 | 90 | passive_updates=True, ) |
|
96 | 91 | |
|
97 | 92 | def generate_api_key(self): |
|
98 | 93 | return generate_api_key() |
|
99 | 94 | |
|
100 | 95 | |
|
101 | 96 | def after_update(mapper, connection, target): |
|
102 | 97 | from appenlight.models.services.application import ApplicationService |
|
103 | 98 | log.info('clearing out ApplicationService cache') |
|
104 | 99 | ApplicationService.by_id_cached().invalidate(target.resource_id) |
|
105 | 100 | ApplicationService.by_api_key_cached().invalidate(target.api_key) |
|
106 | 101 | |
|
107 | 102 | |
|
108 | 103 | sa.event.listen(Application, 'after_update', after_update) |
|
109 | 104 | sa.event.listen(Application, 'after_delete', after_update) |
@@ -1,50 +1,45 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from ziggurat_foundations.models.base import BaseModel |
|
23 | 18 | import sqlalchemy as sa |
|
24 | 19 | |
|
25 | 20 | from appenlight.models import Base |
|
26 | 21 | from appenlight.models.report_group import ReportGroup |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | class ApplicationPostprocessConf(Base, BaseModel): |
|
30 | 25 | """ |
|
31 | 26 | Stores prioritizing conditions for reports |
|
32 | 27 | This is later used for rule parsing like "if 10 occurences bump priority +1" |
|
33 | 28 | """ |
|
34 | 29 | |
|
35 | 30 | __tablename__ = 'application_postprocess_conf' |
|
36 | 31 | |
|
37 | 32 | pkey = sa.Column(sa.Integer(), nullable=False, primary_key=True) |
|
38 | 33 | resource_id = sa.Column(sa.Integer(), |
|
39 | 34 | sa.ForeignKey('resources.resource_id', |
|
40 | 35 | onupdate='CASCADE', |
|
41 | 36 | ondelete='CASCADE')) |
|
42 | 37 | do = sa.Column(sa.Unicode(25), nullable=False) |
|
43 | 38 | new_value = sa.Column(sa.UnicodeText(), nullable=False, default='') |
|
44 | 39 | rule = sa.Column(sa.dialects.postgresql.JSON, |
|
45 | 40 | nullable=False, default={'field': 'http_status', |
|
46 | 41 | "op": "ge", "value": "500"}) |
|
47 | 42 | |
|
48 | 43 | def postprocess(self, item): |
|
49 | 44 | new_value = int(self.new_value) |
|
50 | 45 | item.priority = ReportGroup.priority + new_value |
@@ -1,57 +1,52 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | import sqlalchemy as sa |
|
24 | 19 | |
|
25 | 20 | from datetime import datetime |
|
26 | 21 | from appenlight.models import Base |
|
27 | 22 | from ziggurat_foundations.models.base import BaseModel |
|
28 | 23 | from ziggurat_foundations.models.services.user import UserService |
|
29 | 24 | |
|
30 | 25 | log = logging.getLogger(__name__) |
|
31 | 26 | |
|
32 | 27 | |
|
33 | 28 | class AuthToken(Base, BaseModel): |
|
34 | 29 | """ |
|
35 | 30 | Stores user authentication tokens
|
36 | 31 | """ |
|
37 | 32 | __tablename__ = 'auth_tokens' |
|
38 | 33 | |
|
39 | 34 | id = sa.Column(sa.Integer, primary_key=True, nullable=False) |
|
40 | 35 | token = sa.Column(sa.Unicode(40), nullable=False, |
|
41 | 36 | default=lambda x: UserService.generate_random_string(40)) |
|
42 | 37 | owner_id = sa.Column(sa.Unicode(30), |
|
43 | 38 | sa.ForeignKey('users.id', onupdate='CASCADE', |
|
44 | 39 | ondelete='CASCADE')) |
|
45 | 40 | creation_date = sa.Column(sa.DateTime, default=lambda x: datetime.utcnow()) |
|
46 | 41 | expires = sa.Column(sa.DateTime) |
|
47 | 42 | description = sa.Column(sa.Unicode, default='') |
|
48 | 43 | |
|
49 | 44 | @property |
|
50 | 45 | def is_expired(self): |
|
51 | 46 | if self.expires: |
|
52 | 47 | return self.expires < datetime.utcnow() |
|
53 | 48 | else: |
|
54 | 49 | return False |
|
55 | 50 | |
|
56 | 51 | def __str__(self): |
|
57 | 52 | return '<AuthToken u:%s t:%s...>' % (self.owner_id, self.token[0:10]) |
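The `is_expired` property in the hunk above reduces to a plain UTC comparison, with a missing expiry meaning the token never expires. A minimal standalone sketch of that logic (names taken from the diff, ORM machinery omitted):

```python
from datetime import datetime, timedelta

def is_expired(expires):
    # Mirrors AuthToken.is_expired: a token with no expiry date set
    # is treated as never expiring.
    if expires:
        return expires < datetime.utcnow()
    return False

# A token that expired an hour ago is expired; one with no expiry is not.
print(is_expired(datetime.utcnow() - timedelta(hours=1)))  # True
print(is_expired(None))  # False
```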
@@ -1,37 +1,32 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | from ziggurat_foundations.models.base import BaseModel |
|
24 | 19 | from sqlalchemy.dialects.postgres import JSON |
|
25 | 20 | |
|
26 | 21 | from . import Base |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | class Config(Base, BaseModel): |
|
30 | 25 | __tablename__ = 'config' |
|
31 | 26 | |
|
32 | 27 | key = sa.Column(sa.Unicode, primary_key=True) |
|
33 | 28 | section = sa.Column(sa.Unicode, primary_key=True) |
|
34 | 29 | value = sa.Column(JSON, nullable=False) |
|
35 | 30 | |
|
36 | 31 | def __json__(self, request): |
|
37 | 32 | return self.get_dict() |
@@ -1,160 +1,155 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | import logging |
|
24 | 19 | |
|
25 | 20 | from datetime import datetime |
|
26 | 21 | from appenlight.models import Base, get_db_session |
|
27 | 22 | from appenlight.models.services.report_stat import ReportStatService |
|
28 | 23 | from appenlight.models.resource import Resource |
|
29 | 24 | from appenlight.models.integrations import IntegrationException |
|
30 | 25 | from pyramid.threadlocal import get_current_request |
|
31 | 26 | from sqlalchemy.dialects.postgresql import JSON |
|
32 | 27 | from ziggurat_foundations.models.base import BaseModel |
|
33 | 28 | |
|
34 | 29 | log = logging.getLogger(__name__) |
|
35 | 30 | |
|
36 | 31 | |
|
37 | 32 | class Event(Base, BaseModel): |
|
38 | 33 | __tablename__ = 'events' |
|
39 | 34 | |
|
40 | 35 | types = {'error_report_alert': 1, |
|
41 | 36 | 'slow_report_alert': 3, |
|
42 | 37 | 'comment': 5, |
|
43 | 38 | 'assignment': 6, |
|
44 | 39 | 'uptime_alert': 7, |
|
45 | 40 | 'chart_alert': 9} |
|
46 | 41 | |
|
47 | 42 | statuses = {'active': 1, |
|
48 | 43 | 'closed': 0} |
|
49 | 44 | |
|
50 | 45 | id = sa.Column(sa.Integer, primary_key=True) |
|
51 | 46 | start_date = sa.Column(sa.DateTime, default=datetime.utcnow) |
|
52 | 47 | end_date = sa.Column(sa.DateTime) |
|
53 | 48 | status = sa.Column(sa.Integer, default=1) |
|
54 | 49 | event_type = sa.Column(sa.Integer, default=1) |
|
55 | 50 | origin_user_id = sa.Column(sa.Integer(), sa.ForeignKey('users.id'), |
|
56 | 51 | nullable=True) |
|
57 | 52 | target_user_id = sa.Column(sa.Integer(), sa.ForeignKey('users.id'), |
|
58 | 53 | nullable=True) |
|
59 | 54 | resource_id = sa.Column(sa.Integer(), |
|
60 | 55 | sa.ForeignKey('resources.resource_id'), |
|
61 | 56 | nullable=True) |
|
62 | 57 | target_id = sa.Column(sa.Integer) |
|
63 | 58 | target_uuid = sa.Column(sa.Unicode(40)) |
|
64 | 59 | text = sa.Column(sa.UnicodeText()) |
|
65 | 60 | values = sa.Column(JSON(), nullable=False, default=None) |
|
66 | 61 | |
|
67 | 62 | def __repr__(self): |
|
68 | 63 | return '<Event %s, app:%s, %s>' % (self.unified_alert_name(), |
|
69 | 64 | self.resource_id, |
|
70 | 65 | self.unified_alert_action()) |
|
71 | 66 | |
|
72 | 67 | @property |
|
73 | 68 | def reverse_types(self): |
|
74 | 69 | return dict([(v, k) for k, v in self.types.items()]) |
|
75 | 70 | |
|
76 | 71 | def unified_alert_name(self): |
|
77 | 72 | return self.reverse_types[self.event_type] |
|
78 | 73 | |
|
79 | 74 | def unified_alert_action(self): |
|
80 | 75 | event_name = self.reverse_types[self.event_type] |
|
81 | 76 | if self.status == Event.statuses['closed']: |
|
82 | 77 | return "CLOSE" |
|
83 | 78 | if self.status != Event.statuses['closed']: |
|
84 | 79 | return "OPEN" |
|
85 | 80 | return event_name |
|
86 | 81 | |
|
87 | 82 | def send_alerts(self, request=None, resource=None, db_session=None): |
|
88 | 83 | """" Sends alerts to applicable channels """ |
|
89 | 84 | db_session = get_db_session(db_session) |
|
90 | 85 | db_session.flush() |
|
91 | 86 | if not resource: |
|
92 | 87 | resource = Resource.by_resource_id(self.resource_id) |
|
93 | 88 | if not request: |
|
94 | 89 | request = get_current_request() |
|
95 | 90 | if not resource: |
|
96 | 91 | return |
|
97 | 92 | users = set([p.user for p in resource.users_for_perm('view')]) |
|
98 | 93 | for user in users: |
|
99 | 94 | for channel in user.alert_channels: |
|
100 | 95 | if not channel.channel_validated or not channel.send_alerts: |
|
101 | 96 | continue |
|
102 | 97 | else: |
|
103 | 98 | try: |
|
104 | 99 | channel.notify_alert(resource=resource, |
|
105 | 100 | event=self, |
|
106 | 101 | user=user, |
|
107 | 102 | request=request) |
|
108 | 103 | except IntegrationException as e: |
|
109 | 104 | log.warning('%s' % e) |
|
110 | 105 | |
|
111 | 106 | def validate_or_close(self, since_when, db_session=None): |
|
112 | 107 | """ Checks if alerts should stay open or it's time to close them. |
|
113 | 108 | Generates close alert event if alerts get closed """ |
|
114 | 109 | event_types = [Event.types['error_report_alert'], |
|
115 | 110 | Event.types['slow_report_alert']] |
|
116 | 111 | app = Resource.by_resource_id(self.resource_id) |
|
117 | 112 | if self.event_type in event_types: |
|
118 | 113 | total = ReportStatService.count_by_type( |
|
119 | 114 | self.event_type, self.resource_id, since_when) |
|
120 | 115 | if Event.types['error_report_alert'] == self.event_type: |
|
121 | 116 | threshold = app.error_report_threshold |
|
122 | 117 | if Event.types['slow_report_alert'] == self.event_type: |
|
123 | 118 | threshold = app.slow_report_threshold |
|
124 | 119 | |
|
125 | 120 | if total < threshold: |
|
126 | 121 | self.close() |
|
127 | 122 | |
|
128 | 123 | def close(self, db_session=None): |
|
129 | 124 | """ |
|
130 | 125 | Closes an event and sends notification to affected users |
|
131 | 126 | """ |
|
132 | 127 | self.end_date = datetime.utcnow() |
|
133 | 128 | self.status = Event.statuses['closed'] |
|
134 | 129 | log.warning('ALERT: CLOSE: %s' % self) |
|
135 | 130 | self.send_alerts() |
|
136 | 131 | |
|
137 | 132 | def text_representation(self): |
|
138 | 133 | alert_type = self.unified_alert_name() |
|
139 | 134 | text = '' |
|
140 | 135 | if 'slow_report' in alert_type: |
|
141 | 136 | text += 'Slow report alert' |
|
142 | 137 | if 'error_report' in alert_type: |
|
143 | 138 | text += 'Exception report alert' |
|
144 | 139 | if 'uptime_alert' in alert_type: |
|
145 | 140 | text += 'Uptime alert' |
|
146 | 141 | if 'chart_alert' in alert_type: |
|
147 | 142 | text += 'Metrics value alert' |
|
148 | 143 | |
|
149 | 144 | alert_action = self.unified_alert_action() |
|
150 | 145 | if alert_action == 'OPEN': |
|
151 | 146 | text += ' got opened.' |
|
152 | 147 | if alert_action == 'CLOSE': |
|
153 | 148 | text += ' got closed.' |
|
154 | 149 | return text |
|
155 | 150 | |
|
156 | 151 | def get_dict(self, request=None): |
|
157 | 152 | dict_data = super(Event, self).get_dict() |
|
158 | 153 | dict_data['text'] = self.text_representation() |
|
159 | 154 | dict_data['resource_name'] = self.resource.resource_name |
|
160 | 155 | return dict_data |
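The `Event` model in the hunk above maps integer codes to alert names via an inverted dict, and derives the OPEN/CLOSE action from the status column. A self-contained sketch of those two lookups (constants copied from the diff, ORM omitted):

```python
# Integer codes for event types and statuses, as defined on the Event model.
types = {'error_report_alert': 1,
         'slow_report_alert': 3,
         'comment': 5,
         'assignment': 6,
         'uptime_alert': 7,
         'chart_alert': 9}
statuses = {'active': 1, 'closed': 0}

# Inverted mapping used by unified_alert_name(): code -> name.
reverse_types = {v: k for k, v in types.items()}

def unified_alert_action(status):
    # Mirrors Event.unified_alert_action: closed events report CLOSE,
    # everything else reports OPEN.
    return "CLOSE" if status == statuses['closed'] else "OPEN"

print(reverse_types[7])                          # uptime_alert
print(unified_alert_action(statuses['active']))  # OPEN
```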
@@ -1,41 +1,36 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | from sqlalchemy.ext.declarative import declared_attr |
|
24 | 19 | from ziggurat_foundations.models.external_identity import ExternalIdentityMixin |
|
25 | 20 | |
|
26 | 21 | from appenlight.models import Base |
|
27 | 22 | from appenlight.lib.sqlalchemy_fields import EncryptedUnicode |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class ExternalIdentity(ExternalIdentityMixin, Base): |
|
31 | 26 | @declared_attr |
|
32 | 27 | def access_token(self): |
|
33 | 28 | return sa.Column(EncryptedUnicode(255), default='') |
|
34 | 29 | |
|
35 | 30 | @declared_attr |
|
36 | 31 | def alt_token(self): |
|
37 | 32 | return sa.Column(EncryptedUnicode(255), default='') |
|
38 | 33 | |
|
39 | 34 | @declared_attr |
|
40 | 35 | def token_secret(self): |
|
41 | 36 | return sa.Column(EncryptedUnicode(255), default='') |
@@ -1,50 +1,45 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from ziggurat_foundations.models.group import GroupMixin |
|
23 | 18 | from appenlight.models import Base |
|
24 | 19 | |
|
25 | 20 | |
|
26 | 21 | class Group(GroupMixin, Base): |
|
27 | 22 | __possible_permissions__ = ('root_administration', |
|
28 | 23 | 'test_features', |
|
29 | 24 | 'admin_panel', |
|
30 | 25 | 'admin_users', |
|
31 | 26 | 'manage_partitions',) |
|
32 | 27 | |
|
33 | 28 | def get_dict(self, exclude_keys=None, include_keys=None, |
|
34 | 29 | include_perms=False): |
|
35 | 30 | result = super(Group, self).get_dict(exclude_keys, include_keys) |
|
36 | 31 | if include_perms: |
|
37 | 32 | result['possible_permissions'] = self.__possible_permissions__ |
|
38 | 33 | result['current_permissions'] = [p.perm_name for p in |
|
39 | 34 | self.permissions] |
|
40 | 35 | else: |
|
41 | 36 | result['possible_permissions'] = [] |
|
42 | 37 | result['current_permissions'] = [] |
|
43 | 38 | exclude_keys_list = exclude_keys or [] |
|
44 | 39 | include_keys_list = include_keys or [] |
|
45 | 40 | d = {} |
|
46 | 41 | for k in result.keys(): |
|
47 | 42 | if (k not in exclude_keys_list and |
|
48 | 43 | (k in include_keys_list or not include_keys)): |
|
49 | 44 | d[k] = result[k] |
|
50 | 45 | return d |
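The tail of `Group.get_dict` above implements a small include/exclude key filter over the serialized dict. That filtering step can be sketched on its own (sample data is illustrative, not from the source):

```python
def filter_keys(result, exclude_keys=None, include_keys=None):
    # Mirrors the filtering loop in Group.get_dict: drop excluded keys,
    # and when an include list is given, keep only those keys.
    exclude_keys_list = exclude_keys or []
    include_keys_list = include_keys or []
    d = {}
    for k in result.keys():
        if (k not in exclude_keys_list and
                (k in include_keys_list or not include_keys)):
            d[k] = result[k]
    return d

data = {'id': 1, 'group_name': 'admins', 'secret': 'x'}
print(filter_keys(data, exclude_keys=['secret']))
print(filter_keys(data, include_keys=['group_name']))
```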
@@ -1,27 +1,22 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from ziggurat_foundations.models.group_permission import GroupPermissionMixin |
|
23 | 18 | from appenlight.models import Base |
|
24 | 19 | |
|
25 | 20 | |
|
26 | 21 | class GroupPermission(GroupPermissionMixin, Base): |
|
27 | 22 | pass |
@@ -1,28 +1,23 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from ziggurat_foundations.models.group_resource_permission import \ |
|
23 | 18 | GroupResourcePermissionMixin |
|
24 | 19 | from appenlight.models import Base |
|
25 | 20 | |
|
26 | 21 | |
|
27 | 22 | class GroupResourcePermission(GroupResourcePermissionMixin, Base): |
|
28 | 23 | pass |
@@ -1,83 +1,78 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | from sqlalchemy.dialects.postgresql import JSON |
|
24 | 19 | from sqlalchemy.ext.hybrid import hybrid_property |
|
25 | 20 | from ziggurat_foundations.models.base import BaseModel |
|
26 | 21 | |
|
27 | 22 | from appenlight.lib.encryption import decrypt_dictionary_keys |
|
28 | 23 | from appenlight.lib.encryption import encrypt_dictionary_keys |
|
29 | 24 | from appenlight.models import Base, get_db_session |
|
30 | 25 | |
|
31 | 26 | |
|
32 | 27 | class IntegrationException(Exception): |
|
33 | 28 | pass |
|
34 | 29 | |
|
35 | 30 | |
|
36 | 31 | class IntegrationBase(Base, BaseModel): |
|
37 | 32 | """ |
|
38 | 33 | Model from which all integrations inherit using polymorphic approach |
|
39 | 34 | """ |
|
40 | 35 | __tablename__ = 'integrations' |
|
41 | 36 | |
|
42 | 37 | front_visible = False |
|
43 | 38 | as_alert_channel = False |
|
44 | 39 | supports_report_alerting = False |
|
45 | 40 | |
|
46 | 41 | id = sa.Column(sa.Integer, primary_key=True) |
|
47 | 42 | resource_id = sa.Column(sa.Integer, |
|
48 | 43 | sa.ForeignKey('applications.resource_id')) |
|
49 | 44 | integration_name = sa.Column(sa.Unicode(64)) |
|
50 | 45 | _config = sa.Column('config', JSON(), nullable=False, default='') |
|
51 | 46 | modified_date = sa.Column(sa.DateTime) |
|
52 | 47 | |
|
53 | 48 | channel = sa.orm.relationship('AlertChannel', |
|
54 | 49 | cascade="all,delete-orphan", |
|
55 | 50 | passive_deletes=True, |
|
56 | 51 | passive_updates=True, |
|
57 | 52 | uselist=False, |
|
58 | 53 | backref='integration') |
|
59 | 54 | |
|
60 | 55 | __mapper_args__ = { |
|
61 | 56 | 'polymorphic_on': 'integration_name', |
|
62 | 57 | 'polymorphic_identity': 'integration' |
|
63 | 58 | } |
|
64 | 59 | |
|
65 | 60 | @classmethod |
|
66 | 61 | def by_app_id_and_integration_name(cls, resource_id, integration_name, |
|
67 | 62 | db_session=None): |
|
68 | 63 | db_session = get_db_session(db_session) |
|
69 | 64 | query = db_session.query(cls) |
|
70 | 65 | query = query.filter(cls.integration_name == integration_name) |
|
71 | 66 | query = query.filter(cls.resource_id == resource_id) |
|
72 | 67 | return query.first() |
|
73 | 68 | |
|
74 | 69 | @hybrid_property |
|
75 | 70 | def config(self): |
|
76 | 71 | return decrypt_dictionary_keys(self._config) |
|
77 | 72 | |
|
78 | 73 | @config.setter |
|
79 | 74 | def config(self, value): |
|
80 | 75 | if not hasattr(value, 'items'): |
|
81 | 76 | raise Exception('IntegrationBase.config only accepts ' |
|
82 | 77 | 'flat dictionaries') |
|
83 | 78 | self._config = encrypt_dictionary_keys(value) |
@@ -1,168 +1,163 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import requests |
|
23 | 18 | from requests_oauthlib import OAuth1 |
|
24 | 19 | from appenlight.models.integrations import (IntegrationBase, |
|
25 | 20 | IntegrationException) |
|
26 | 21 | |
|
27 | 22 | _ = str |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class NotFoundException(Exception): |
|
31 | 26 | pass |
|
32 | 27 | |
|
33 | 28 | |
|
34 | 29 | class BitbucketIntegration(IntegrationBase): |
|
35 | 30 | __mapper_args__ = { |
|
36 | 31 | 'polymorphic_identity': 'bitbucket' |
|
37 | 32 | } |
|
38 | 33 | front_visible = True |
|
39 | 34 | as_alert_channel = False |
|
40 | 35 | supports_report_alerting = False |
|
41 | 36 | action_notification = True |
|
42 | 37 | integration_action = 'Add issue to Bitbucket' |
|
43 | 38 | |
|
44 | 39 | @classmethod |
|
45 | 40 | def create_client(cls, request, user_name=None, repo_name=None): |
|
46 | 41 | """ |
|
47 | 42 | Creates REST client that can authenticate to specific repo |
|
48 | 43 | uses auth tokens for current request user |
|
49 | 44 | """ |
|
50 | 45 | config = request.registry.settings |
|
51 | 46 | token = None |
|
52 | 47 | secret = None |
|
53 | 48 | for identity in request.user.external_identities: |
|
54 | 49 | if identity.provider_name == 'bitbucket': |
|
55 | 50 | token = identity.access_token |
|
56 | 51 | secret = identity.token_secret |
|
57 | 52 | break |
|
58 | 53 | if not token: |
|
59 | 54 | raise IntegrationException( |
|
60 | 55 | 'No valid auth token present for this service') |
|
61 | 56 | client = BitbucketClient(token, secret, |
|
62 | 57 | user_name, |
|
63 | 58 | repo_name, |
|
64 | 59 | config['authomatic.pr.bitbucket.key'], |
|
65 | 60 | config['authomatic.pr.bitbucket.secret']) |
|
66 | 61 | return client |
|
67 | 62 | |
|
68 | 63 | |
|
69 | 64 | class BitbucketClient(object): |
|
70 | 65 | api_url = 'https://bitbucket.org/api/1.0' |
|
71 | 66 | repo_type = 'bitbucket' |
|
72 | 67 | |
|
73 | 68 | def __init__(self, token, secret, owner, repo_name, bitbucket_consumer_key, |
|
74 | 69 | bitbucket_consumer_secret): |
|
75 | 70 | self.access_token = token |
|
76 | 71 | self.token_secret = secret |
|
77 | 72 | self.owner = owner |
|
78 | 73 | self.repo_name = repo_name |
|
79 | 74 | self.bitbucket_consumer_key = bitbucket_consumer_key |
|
80 | 75 | self.bitbucket_consumer_secret = bitbucket_consumer_secret |
|
81 | 76 | |
|
82 | 77 | possible_keys = { |
|
83 | 78 | 'status': ['new', 'open', 'resolved', 'on hold', 'invalid', |
|
84 | 79 | 'duplicate', 'wontfix'], |
|
85 | 80 | 'priority': ['trivial', 'minor', 'major', 'critical', 'blocker'], |
|
86 | 81 | 'kind': ['bug', 'enhancement', 'proposal', 'task'] |
|
87 | 82 | } |
|
88 | 83 | |
|
89 | 84 | def get_statuses(self): |
|
90 | 85 | """Gets list of possible item statuses""" |
|
91 | 86 | return self.possible_keys['status'] |
|
92 | 87 | |
|
93 | 88 | def get_priorities(self): |
|
94 | 89 | """Gets list of possible item statuses""" |
|
95 | 90 | return self.possible_keys['priority'] |
|
96 | 91 | |
|
97 | 92 | def make_request(self, url, method='get', data=None, headers=None): |
|
98 | 93 | """ |
|
99 | 94 | Performs HTTP request to bitbucket |
|
100 | 95 | """ |
|
101 | 96 | auth = OAuth1(self.bitbucket_consumer_key, |
|
102 | 97 | self.bitbucket_consumer_secret, |
|
103 | 98 | self.access_token, self.token_secret) |
|
104 | 99 | try: |
|
105 | 100 | resp = getattr(requests, method)(url, data=data, auth=auth, |
|
106 | 101 | timeout=10) |
|
107 | 102 | except Exception as e: |
|
108 | 103 | raise IntegrationException( |
|
109 | 104 | _('Error communicating with Bitbucket: %s') % (e,)) |
|
110 | 105 | if resp.status_code == 401: |
|
111 | 106 | raise IntegrationException( |
|
112 | 107 | _('You are not authorized to access this repo')) |
|
113 | 108 | elif resp.status_code == 404: |
|
114 | 109 | raise IntegrationException(_('User or repo name are incorrect')) |
|
115 | 110 | elif resp.status_code not in [200, 201]: |
|
116 | 111 | raise IntegrationException( |
|
117 | 112 | _('Bitbucket response_code: %s') % resp.status_code) |
|
118 | 113 | try: |
|
119 | 114 | return resp.json() |
|
120 | 115 | except Exception as e: |
|
121 | 116 | raise IntegrationException( |
|
122 | 117 | _('Error decoding response from Bitbucket: %s') % (e,)) |
|
123 | 118 | |
|
124 | 119 | def get_assignees(self): |
|
125 | 120 | """Gets list of possible assignees""" |
|
126 | 121 | url = '%(api_url)s/privileges/%(owner)s/%(repo_name)s' % { |
|
127 | 122 | 'api_url': self.api_url, |
|
128 | 123 | 'owner': self.owner, |
|
129 | 124 | 'repo_name': self.repo_name} |
|
130 | 125 | |
|
131 | 126 | data = self.make_request(url) |
|
132 | 127 | results = [{'user': self.owner, 'name': '(Repo owner)'}] |
|
133 | 128 | if data: |
|
134 | 129 | for entry in data: |
|
135 | 130 | results.append({"user": entry['user']['username'], |
|
136 | 131 | "name": entry['user'].get('display_name')}) |
|
137 | 132 | return results |
|
138 | 133 | |
|
139 | 134 | def create_issue(self, form_data): |
|
140 | 135 | """ |
|
141 | 136 | Creates a new issue in the tracker using a REST call
|
142 | 137 | """ |
|
143 | 138 | url = '%(api_url)s/repositories/%(owner)s/%(repo_name)s/issues/' % { |
|
144 | 139 | 'api_url': self.api_url, |
|
145 | 140 | 'owner': self.owner, |
|
146 | 141 | 'repo_name': self.repo_name} |
|
147 | 142 | |
|
148 | 143 | payload = { |
|
149 | 144 | "title": form_data['title'], |
|
150 | 145 | "content": form_data['content'], |
|
151 | 146 | "kind": form_data['kind'], |
|
152 | 147 | "priority": form_data['priority'], |
|
153 | 148 | "responsible": form_data['responsible'] |
|
154 | 149 | } |
|
155 | 150 | data = self.make_request(url, 'post', payload) |
|
156 | 151 | f_args = { |
|
157 | 152 | "owner": self.owner, |
|
158 | 153 | "repo_name": self.repo_name, |
|
159 | 154 | "issue_id": data['local_id'] |
|
160 | 155 | } |
|
161 | 156 | web_url = 'https://bitbucket.org/%(owner)s/%(repo_name)s' \ |
|
162 | 157 | '/issue/%(issue_id)s/issue-title' % f_args |
|
163 | 158 | to_return = { |
|
164 | 159 | 'id': data['local_id'], |
|
165 | 160 | 'resource_url': data['resource_uri'], |
|
166 | 161 | 'web_url': web_url |
|
167 | 162 | } |
|
168 | 163 | return to_return |
@@ -1,79 +1,74 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | |
|
24 | 19 | from requests.exceptions import HTTPError, ConnectionError |
|
25 | 20 | from camplight import Request, Campfire |
|
26 | 21 | from camplight.exceptions import CamplightException |
|
27 | 22 | |
|
28 | 23 | from appenlight.models.integrations import (IntegrationBase, |
|
29 | 24 | IntegrationException) |
|
30 | 25 | |
|
31 | 26 | _ = str |
|
32 | 27 | |
|
33 | 28 | log = logging.getLogger(__name__) |
|
34 | 29 | |
|
35 | 30 | |
|
36 | 31 | class NotFoundException(Exception): |
|
37 | 32 | pass |
|
38 | 33 | |
|
39 | 34 | |
|
40 | 35 | class CampfireIntegration(IntegrationBase): |
|
41 | 36 | __mapper_args__ = { |
|
42 | 37 | 'polymorphic_identity': 'campfire' |
|
43 | 38 | } |
|
44 | 39 | front_visible = False |
|
45 | 40 | as_alert_channel = True |
|
46 | 41 | supports_report_alerting = True |
|
47 | 42 | action_notification = True |
|
48 | 43 | integration_action = 'Message via Campfire' |
|
49 | 44 | |
|
50 | 45 | @classmethod |
|
51 | 46 | def create_client(cls, api_token, account): |
|
52 | 47 | client = CampfireClient(api_token, account) |
|
53 | 48 | return client |
|
54 | 49 | |
|
55 | 50 | |
|
56 | 51 | class CampfireClient(object): |
|
57 | 52 | def __init__(self, api_token, account): |
|
58 | 53 | request = Request('https://%s.campfirenow.com' % account, api_token) |
|
59 | 54 | self.campfire = Campfire(request) |
|
60 | 55 | |
|
61 | 56 | def get_account(self): |
|
62 | 57 | try: |
|
63 | 58 | return self.campfire.account() |
|
64 | 59 | except (HTTPError, CamplightException) as e: |
|
65 | 60 | raise IntegrationException(str(e)) |
|
66 | 61 | |
|
67 | 62 | def get_rooms(self): |
|
68 | 63 | try: |
|
69 | 64 | return self.campfire.rooms() |
|
70 | 65 | except (HTTPError, CamplightException) as e: |
|
71 | 66 | raise IntegrationException(str(e)) |
|
72 | 67 | |
|
73 | 68 | def speak_to_room(self, room, message, sound='RIMSHOT'): |
|
74 | 69 | try: |
|
75 | 70 | room = self.campfire.room(room) |
|
76 | 71 | room.join() |
|
77 | 72 | room.speak(message, type_='TextMessage') |
|
78 | 73 | except (HTTPError, CamplightException, ConnectionError) as e: |
|
79 | 74 | raise IntegrationException(str(e)) |
@@ -1,87 +1,82 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | |
|
24 | 19 | import requests |
|
25 | 20 | from requests.auth import HTTPBasicAuth |
|
26 | 21 | import simplejson as json |
|
27 | 22 | |
|
28 | 23 | from appenlight.models.integrations import (IntegrationBase, |
|
29 | 24 | IntegrationException) |
|
30 | 25 | |
|
31 | 26 | _ = str |
|
32 | 27 | |
|
33 | 28 | log = logging.getLogger(__name__) |
|
34 | 29 | |
|
35 | 30 | |
|
36 | 31 | class NotFoundException(Exception): |
|
37 | 32 | pass |
|
38 | 33 | |
|
39 | 34 | |
|
40 | 35 | class FlowdockIntegration(IntegrationBase): |
|
41 | 36 | __mapper_args__ = { |
|
42 | 37 | 'polymorphic_identity': 'flowdock' |
|
43 | 38 | } |
|
44 | 39 | front_visible = False |
|
45 | 40 | as_alert_channel = True |
|
46 | 41 | supports_report_alerting = True |
|
47 | 42 | action_notification = True |
|
48 | 43 | integration_action = 'Message via Flowdock' |
|
49 | 44 | |
|
50 | 45 | @classmethod |
|
51 | 46 | def create_client(cls, api_token): |
|
52 | 47 | client = FlowdockClient(api_token) |
|
53 | 48 | return client |
|
54 | 49 | |
|
55 | 50 | |
|
56 | 51 | class FlowdockClient(object): |
|
57 | 52 | def __init__(self, api_token): |
|
58 | 53 | self.auth = HTTPBasicAuth(api_token, '') |
|
59 | 54 | self.api_token = api_token |
|
60 | 55 | self.api_url = 'https://api.flowdock.com/v1/messages' |
|
61 | 56 | |
|
62 | 57 | def make_request(self, url, method='get', data=None): |
|
63 | 58 | headers = { |
|
64 | 59 | 'Content-Type': 'application/json', |
|
65 | 60 | 'User-Agent': 'appenlight-flowdock', |
|
66 | 61 | } |
|
67 | 62 | try: |
|
68 | 63 | if data: |
|
69 | 64 | data = json.dumps(data) |
|
70 | 65 | resp = getattr(requests, method)(url, data=data, headers=headers, |
|
71 | 66 | timeout=10) |
|
72 | 67 | except Exception as e: |
|
73 | 68 | raise IntegrationException( |
|
74 | 69 | _('Error communicating with Flowdock: %s') % (e,)) |
|
75 | 70 | if resp.status_code > 299: |
|
76 | 71 | raise IntegrationException(resp.text) |
|
77 | 72 | return resp |
|
78 | 73 | |
|
79 | 74 | def send_to_chat(self, payload): |
|
80 | 75 | url = '%(api_url)s/chat/%(api_token)s' % {'api_url': self.api_url, |
|
81 | 76 | 'api_token': self.api_token} |
|
82 | 77 | return self.make_request(url, method='post', data=payload).json() |
|
83 | 78 | |
|
84 | 79 | def send_to_inbox(self, payload): |
|
85 | 80 | f_args = {'api_url': self.api_url, 'api_token': self.api_token} |
|
86 | 81 | url = '%(api_url)s/team_inbox/%(api_token)s' % f_args |
|
87 | 82 | return self.make_request(url, method='post', data=payload).json() |
@@ -1,161 +1,156 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import json |
|
23 | 18 | import requests |
|
24 | 19 | |
|
25 | 20 | from . import IntegrationBase, IntegrationException |
|
26 | 21 | |
|
27 | 22 | _ = str |
|
28 | 23 | |
|
29 | 24 | |
|
30 | 25 | class GithubAuthException(Exception): |
|
31 | 26 | pass |
|
32 | 27 | |
|
33 | 28 | |
|
34 | 29 | class GithubIntegration(IntegrationBase): |
|
35 | 30 | __mapper_args__ = { |
|
36 | 31 | 'polymorphic_identity': 'github' |
|
37 | 32 | } |
|
38 | 33 | front_visible = True |
|
39 | 34 | as_alert_channel = False |
|
40 | 35 | supports_report_alerting = False |
|
41 | 36 | action_notification = True |
|
42 | 37 | integration_action = 'Add issue to Github' |
|
43 | 38 | |
|
44 | 39 | @classmethod |
|
45 | 40 | def create_client(cls, request, user_name=None, repo_name=None): |
|
46 | 41 | """ |
|
47 | 42 | Creates REST client that can authenticate to specific repo |
|
48 | 43 | uses auth tokens for current request user |
|
49 | 44 | """ |
|
50 | 45 | token = None |
|
51 | 46 | secret = None |
|
52 | 47 | for identity in request.user.external_identities: |
|
53 | 48 | if identity.provider_name == 'github': |
|
54 | 49 | token = identity.access_token |
|
55 | 50 | secret = identity.token_secret |
|
56 | 51 | break |
|
57 | 52 | if not token: |
|
58 | 53 | raise IntegrationException( |
|
59 | 54 | 'No valid auth token present for this service') |
|
60 | 55 | client = GithubClient(token=token, owner=user_name, name=repo_name) |
|
61 | 56 | return client |
|
62 | 57 | |
|
63 | 58 | |
|
64 | 59 | class GithubClient(object): |
|
65 | 60 | api_url = 'https://api.github.com' |
|
66 | 61 | repo_type = 'github' |
|
67 | 62 | |
|
68 | 63 | def __init__(self, token, owner, name): |
|
69 | 64 | self.access_token = token |
|
70 | 65 | self.owner = owner |
|
71 | 66 | self.name = name |
|
72 | 67 | |
|
73 | 68 | def make_request(self, url, method='get', data=None, headers=None): |
|
74 | 69 | req_headers = {'User-Agent': 'appenlight', |
|
75 | 70 | 'Content-Type': 'application/json', |
|
76 | 71 | 'Authorization': 'token %s' % self.access_token} |
|
77 | 72 | try: |
|
78 | 73 | if data: |
|
79 | 74 | data = json.dumps(data) |
|
80 | 75 | resp = getattr(requests, method)(url, data=data, |
|
81 | 76 | headers=req_headers, |
|
82 | 77 | timeout=10) |
|
83 | 78 | except Exception as e: |
|
84 | 79 | msg = 'Error communicating with Github: %s' |
|
85 | 80 | raise IntegrationException(_(msg) % (e,)) |
|
86 | 81 | |
|
87 | 82 | if resp.status_code == 404: |
|
88 | 83 | msg = 'User or repo name are incorrect' |
|
89 | 84 | raise IntegrationException(_(msg)) |
|
90 | 85 | if resp.status_code == 401: |
|
91 | 86 | msg = 'You are not authorized to access this repo' |
|
92 | 87 | raise IntegrationException(_(msg)) |
|
93 | 88 | elif resp.status_code not in [200, 201]: |
|
94 | 89 | msg = 'Github response_code: %s' |
|
95 | 90 | raise IntegrationException(_(msg) % resp.status_code) |
|
96 | 91 | try: |
|
97 | 92 | return resp.json() |
|
98 | 93 | except Exception as e: |
|
99 | 94 | msg = 'Error decoding response from Github: %s' |
|
100 | 95 | raise IntegrationException(_(msg) % (e,)) |
|
101 | 96 | |
|
102 | 97 | def get_statuses(self): |
|
103 | 98 | """Gets list of possible item statuses""" |
|
104 | 99 | url = '%(api_url)s/repos/%(owner)s/%(name)s/labels' % { |
|
105 | 100 | 'api_url': self.api_url, |
|
106 | 101 | 'owner': self.owner, |
|
107 | 102 | 'name': self.name} |
|
108 | 103 | |
|
109 | 104 | data = self.make_request(url) |
|
110 | 105 | |
|
111 | 106 | statuses = [] |
|
112 | 107 | for status in data: |
|
113 | 108 | statuses.append(status['name']) |
|
114 | 109 | return statuses |
|
115 | 110 | |
|
116 | 111 | def get_repo(self): |
|
117 | 112 | """Gets list of possible item statuses""" |
|
118 | 113 | url = '%(api_url)s/repos/%(owner)s/%(name)s' % { |
|
119 | 114 | 'api_url': self.api_url, |
|
120 | 115 | 'owner': self.owner, |
|
121 | 116 | 'name': self.name} |
|
122 | 117 | |
|
123 | 118 | data = self.make_request(url) |
|
124 | 119 | return data |
|
125 | 120 | |
|
126 | 121 | def get_assignees(self): |
|
127 | 122 | """Gets list of possible assignees""" |
|
128 | 123 | url = '%(api_url)s/repos/%(owner)s/%(name)s/collaborators' % { |
|
129 | 124 | 'api_url': self.api_url, |
|
130 | 125 | 'owner': self.owner, |
|
131 | 126 | 'name': self.name} |
|
132 | 127 | data = self.make_request(url) |
|
133 | 128 | results = [] |
|
134 | 129 | for entry in data: |
|
135 | 130 | results.append({"user": entry['login'], |
|
136 | 131 | "name": entry.get('name')}) |
|
137 | 132 | return results |
|
138 | 133 | |
|
139 | 134 | def create_issue(self, form_data): |
|
140 | 135 | """ |
|
141 | 136 | Make a REST call to create issue in Github's issue tracker |
|
142 | 137 | """ |
|
143 | 138 | url = '%(api_url)s/repos/%(owner)s/%(name)s/issues' % { |
|
144 | 139 | 'api_url': self.api_url, |
|
145 | 140 | 'owner': self.owner, |
|
146 | 141 | 'name': self.name} |
|
147 | 142 | |
|
148 | 143 | payload = { |
|
149 | 144 | "title": form_data['title'], |
|
150 | 145 | "body": form_data['content'], |
|
151 | 146 | "labels": [], |
|
152 | 147 | "assignee": form_data['responsible'] |
|
153 | 148 | } |
|
154 | 149 | payload['labels'].extend(form_data['kind']) |
|
155 | 150 | data = self.make_request(url, 'post', data=payload) |
|
156 | 151 | to_return = { |
|
157 | 152 | 'id': data['number'], |
|
158 | 153 | 'resource_url': data['url'], |
|
159 | 154 | 'web_url': data['html_url'] |
|
160 | 155 | } |
|
161 | 156 | return to_return |
@@ -1,88 +1,83 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | |
|
24 | 19 | import requests |
|
25 | 20 | |
|
26 | 21 | from . import IntegrationBase, IntegrationException |
|
27 | 22 | |
|
28 | 23 | _ = str |
|
29 | 24 | |
|
30 | 25 | log = logging.getLogger(__name__) |
|
31 | 26 | |
|
32 | 27 | |
|
33 | 28 | class NotFoundException(Exception): |
|
34 | 29 | pass |
|
35 | 30 | |
|
36 | 31 | |
|
37 | 32 | class HipchatIntegration(IntegrationBase): |
|
38 | 33 | __mapper_args__ = { |
|
39 | 34 | 'polymorphic_identity': 'hipchat' |
|
40 | 35 | } |
|
41 | 36 | front_visible = False |
|
42 | 37 | as_alert_channel = True |
|
43 | 38 | supports_report_alerting = True |
|
44 | 39 | action_notification = True |
|
45 | 40 | integration_action = 'Message via Hipchat' |
|
46 | 41 | |
|
47 | 42 | @classmethod |
|
48 | 43 | def create_client(cls, api_token): |
|
49 | 44 | client = HipchatClient(api_token) |
|
50 | 45 | return client |
|
51 | 46 | |
|
52 | 47 | |
|
53 | 48 | class HipchatClient(object): |
|
54 | 49 | def __init__(self, api_token): |
|
55 | 50 | self.api_token = api_token |
|
56 | 51 | self.api_url = 'https://api.hipchat.com/v1' |
|
57 | 52 | |
|
58 | 53 | def make_request(self, endpoint, method='get', data=None): |
|
59 | 54 | headers = { |
|
60 | 55 | 'User-Agent': 'appenlight-hipchat', |
|
61 | 56 | } |
|
62 | 57 | url = '%s%s' % (self.api_url, endpoint) |
|
63 | 58 | params = { |
|
64 | 59 | 'format': 'json', |
|
65 | 60 | 'auth_token': self.api_token |
|
66 | 61 | } |
|
67 | 62 | try: |
|
68 | 63 | resp = getattr(requests, method)(url, data=data, headers=headers, |
|
69 | 64 | params=params, |
|
70 | 65 | timeout=3) |
|
71 | 66 | except Exception as e: |
|
72 | 67 | msg = 'Error communicating with Hipchat: %s' |
|
73 | 68 | raise IntegrationException(_(msg) % (e,)) |
|
74 | 69 | if resp.status_code == 404: |
|
75 | 70 | msg = 'Error communicating with Hipchat - Room not found' |
|
76 | 71 | raise IntegrationException(msg) |
|
77 | 72 | elif resp.status_code != requests.codes.ok: |
|
78 | 73 | msg = 'Error communicating with Hipchat - status code: %s' |
|
79 | 74 | raise IntegrationException(msg % resp.status_code) |
|
80 | 75 | return resp |
|
81 | 76 | |
|
82 | 77 | def get_rooms(self): |
|
83 | 78 | # not used with notification api token |
|
84 | 79 | return self.make_request('/rooms/list') |
|
85 | 80 | |
|
86 | 81 | def send(self, payload): |
|
87 | 82 | return self.make_request('/rooms/message', method='post', |
|
88 | 83 | data=payload).json() |
@@ -1,141 +1,136 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import jira |
|
23 | 18 | from appenlight.models.integrations import (IntegrationBase, |
|
24 | 19 | IntegrationException) |
|
25 | 20 | |
|
26 | 21 | _ = str |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | class NotFoundException(Exception): |
|
30 | 25 | pass |
|
31 | 26 | |
|
32 | 27 | |
|
33 | 28 | class JiraIntegration(IntegrationBase): |
|
34 | 29 | __mapper_args__ = { |
|
35 | 30 | 'polymorphic_identity': 'jira' |
|
36 | 31 | } |
|
37 | 32 | front_visible = True |
|
38 | 33 | as_alert_channel = False |
|
39 | 34 | supports_report_alerting = False |
|
40 | 35 | action_notification = True |
|
41 | 36 | integration_action = 'Add issue to Jira' |
|
42 | 37 | |
|
43 | 38 | |
|
44 | 39 | class JiraClient(object): |
|
45 | 40 | def __init__(self, user_name, password, host_name, project, request=None): |
|
46 | 41 | self.user_name = user_name |
|
47 | 42 | self.password = password |
|
48 | 43 | self.host_name = host_name |
|
49 | 44 | self.project = project |
|
50 | 45 | self.request = request |
|
51 | 46 | try: |
|
52 | 47 | self.client = jira.client.JIRA(options={'server': host_name}, |
|
53 | 48 | basic_auth=(user_name, password)) |
|
54 | 49 | except jira.JIRAError as e: |
|
55 | 50 | raise IntegrationException( |
|
56 | 51 | 'Communication problem: HTTP_STATUS:%s, URL:%s ' % ( |
|
57 | 52 | e.status_code, e.url)) |
|
58 | 53 | |
|
59 | 54 | def get_projects(self): |
|
60 | 55 | projects = self.client.projects() |
|
61 | 56 | return projects |
|
62 | 57 | |
|
63 | 58 | def get_assignees(self, request): |
|
64 | 59 | """Gets list of possible assignees""" |
|
65 | 60 | cache_region = request.registry.cache_regions.redis_sec_30 |
|
66 | 61 | @cache_region.cache_on_arguments('JiraClient.get_assignees') |
|
67 | 62 | def cached(project_name): |
|
68 | 63 | users = self.client.search_assignable_users_for_issues( |
|
69 | 64 | None, project=project_name) |
|
70 | 65 | results = [] |
|
71 | 66 | for user in users: |
|
72 | 67 | results.append({"id": user.name, "name": user.displayName}) |
|
73 | 68 | return results |
|
74 | 69 | return cached(self.project) |
|
75 | 70 | |
|
76 | 71 | def get_issue_types(self, request): |
|
77 | 72 | metadata = self.get_metadata(request) |
|
78 | 73 | assignees = self.get_assignees(request) |
|
79 | 74 | parsed_metadata = [] |
|
80 | 75 | for entry in metadata['projects'][0]['issuetypes']: |
|
81 | 76 | issue = {"name": entry['name'], |
|
82 | 77 | "id": entry['id'], |
|
83 | 78 | "fields": []} |
|
84 | 79 | for i_id, field_i in entry['fields'].items(): |
|
85 | 80 | field = { |
|
86 | 81 | "name": field_i['name'], |
|
87 | 82 | "id": i_id, |
|
88 | 83 | "required": field_i['required'], |
|
89 | 84 | "values": [], |
|
90 | 85 | "type": field_i['schema'].get('type') |
|
91 | 86 | } |
|
92 | 87 | if field_i.get('allowedValues'): |
|
93 | 88 | field['values'] = [] |
|
94 | 89 | for i in field_i['allowedValues']: |
|
95 | 90 | field['values'].append( |
|
96 | 91 | {'id': i['id'], |
|
97 | 92 | 'name': i.get('name', i.get('value', '')) |
|
98 | 93 | }) |
|
99 | 94 | if field['id'] == 'assignee': |
|
100 | 95 | field['values'] = assignees |
|
101 | 96 | issue['fields'].append(field) |
|
102 | 97 | parsed_metadata.append(issue) |
|
103 | 98 | return parsed_metadata |
|
104 | 99 | |
|
105 | 100 | def get_metadata(self, request): |
|
106 | 101 | # cache_region = request.registry.cache_regions.redis_sec_30 |
|
107 | 102 | # @cache_region.cache_on_arguments('JiraClient.get_metadata') |
|
108 | 103 | def cached(project_name): |
|
109 | 104 | return self.client.createmeta( |
|
110 | 105 | projectKeys=project_name, expand='projects.issuetypes.fields') |
|
111 | 106 | return cached(self.project) |
|
112 | 107 | |
|
113 | 108 | def create_issue(self, form_data, request): |
|
114 | 109 | issue_types = self.get_issue_types(request) |
|
115 | 110 | payload = { |
|
116 | 111 | 'project': {'key': form_data['project']}, |
|
117 | 112 | 'summary': form_data['title'], |
|
118 | 113 | 'description': form_data['content'], |
|
119 | 114 | 'issuetype': {'id': form_data['issue_type']}, |
|
120 | 115 | "priority": {'id': form_data['priority']}, |
|
121 | 116 | "assignee": {'name': form_data['responsible']}, |
|
122 | 117 | } |
|
123 | 118 | for issue_type in issue_types: |
|
124 | 119 | if issue_type['id'] == form_data['issue_type']: |
|
125 | 120 | for field in issue_type['fields']: |
|
126 | 121 | # set some defaults for other required fields |
|
127 | 122 | if field == 'reporter': |
|
128 | 123 | payload["reporter"] = {'id': self.user_name} |
|
129 | 124 | if field['required'] and field['id'] not in payload: |
|
130 | 125 | if field['type'] == 'array': |
|
131 | 126 | payload[field['id']] = [field['values'][0], ] |
|
132 | 127 | elif field['type'] == 'string': |
|
133 | 128 | payload[field['id']] = '' |
|
134 | 129 | new_issue = self.client.create_issue(fields=payload) |
|
135 | 130 | web_url = self.host_name + '/browse/' + new_issue.key |
|
136 | 131 | to_return = { |
|
137 | 132 | 'id': new_issue.id, |
|
138 | 133 | 'resource_url': new_issue.self, |
|
139 | 134 | 'web_url': web_url |
|
140 | 135 | } |
|
141 | 136 | return to_return |
@@ -1,79 +1,74 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | |
|
24 | 19 | import requests |
|
25 | 20 | |
|
26 | 21 | from appenlight.models.integrations import (IntegrationBase, |
|
27 | 22 | IntegrationException) |
|
28 | 23 | from appenlight.lib.ext_json import json |
|
29 | 24 | |
|
30 | 25 | _ = str |
|
31 | 26 | |
|
32 | 27 | log = logging.getLogger(__name__) |
|
33 | 28 | |
|
34 | 29 | |
|
35 | 30 | class NotFoundException(Exception): |
|
36 | 31 | pass |
|
37 | 32 | |
|
38 | 33 | |
|
39 | 34 | class SlackIntegration(IntegrationBase): |
|
40 | 35 | __mapper_args__ = { |
|
41 | 36 | 'polymorphic_identity': 'slack' |
|
42 | 37 | } |
|
43 | 38 | front_visible = False |
|
44 | 39 | as_alert_channel = True |
|
45 | 40 | supports_report_alerting = True |
|
46 | 41 | action_notification = True |
|
47 | 42 | integration_action = 'Message via Slack' |
|
48 | 43 | |
|
49 | 44 | @classmethod |
|
50 | 45 | def create_client(cls, api_token): |
|
51 | 46 | client = SlackClient(api_token) |
|
52 | 47 | return client |
|
53 | 48 | |
|
54 | 49 | |
|
55 | 50 | class SlackClient(object): |
|
56 | 51 | def __init__(self, api_url): |
|
57 | 52 | self.api_url = api_url |
|
58 | 53 | |
|
59 | 54 | def make_request(self, data=None): |
|
60 | 55 | headers = { |
|
61 | 56 | 'User-Agent': 'appenlight-slack', |
|
62 | 57 | 'Content-Type': 'application/json' |
|
63 | 58 | } |
|
64 | 59 | try: |
|
65 | 60 | resp = getattr(requests, 'post')(self.api_url, |
|
66 | 61 | data=json.dumps(data), |
|
67 | 62 | headers=headers, |
|
68 | 63 | timeout=3) |
|
69 | 64 | except Exception as e: |
|
70 | 65 | raise IntegrationException( |
|
71 | 66 | _('Error communicating with Slack: %s') % (e,)) |
|
72 | 67 | if resp.status_code != requests.codes.ok: |
|
73 | 68 | msg = 'Error communicating with Slack - status code: %s' |
|
74 | 69 | raise IntegrationException(msg % resp.status_code) |
|
75 | 70 | return resp |
|
76 | 71 | |
|
77 | 72 | def send(self, payload): |
|
78 | 73 | return self.make_request('/rooms/message', method='post', |
|
79 | 74 | data=payload).json() |
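Review note on the Slack client above: `send()` calls `self.make_request('/rooms/message', method='post', data=payload)`, but `make_request(self, data=None)` accepts neither a positional URL nor a `method` keyword, so `send()` would raise `TypeError` at call time. A minimal standalone sketch (not the AppEnlight code; `json` serialization stands in for the real `requests.post` call, and the webhook URL is a placeholder) of a client whose `send()` matches its request helper:

```python
import json


class SlackClient:
    """Hypothetical sketch; the real client POSTs to self.api_url."""

    def __init__(self, api_url):
        self.api_url = api_url

    def make_request(self, data=None):
        # Stand-in for requests.post(self.api_url, data=json.dumps(data), ...)
        return json.dumps(data)

    def send(self, payload):
        # Pass only the keyword make_request() actually accepts.
        return self.make_request(data=payload)


client = SlackClient("https://hooks.slack.com/services/XXX")
print(client.send({"text": "hello"}))  # prints {"text": "hello"}
```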
@@ -1,143 +1,138 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | |
|
24 | 19 | import requests |
|
25 | 20 | |
|
26 | 21 | from appenlight.models.integrations import (IntegrationBase, |
|
27 | 22 | IntegrationException) |
|
28 | 23 | from appenlight.models.alert_channel import AlertChannel |
|
29 | 24 | from appenlight.lib.ext_json import json |
|
30 | 25 | |
|
31 | 26 | _ = str |
|
32 | 27 | |
|
33 | 28 | log = logging.getLogger(__name__) |
|
34 | 29 | |
|
35 | 30 | |
|
36 | 31 | class NotFoundException(Exception): |
|
37 | 32 | pass |
|
38 | 33 | |
|
39 | 34 | |
|
40 | 35 | class WebhooksIntegration(IntegrationBase): |
|
41 | 36 | __mapper_args__ = { |
|
42 | 37 | 'polymorphic_identity': 'webhooks' |
|
43 | 38 | } |
|
44 | 39 | front_visible = False |
|
45 | 40 | as_alert_channel = True |
|
46 | 41 | supports_report_alerting = True |
|
47 | 42 | action_notification = True |
|
48 | 43 | integration_action = 'Message via Webhooks' |
|
49 | 44 | |
|
50 | 45 | @classmethod |
|
51 | 46 | def create_client(cls, url): |
|
52 | 47 | client = WebhooksClient(url) |
|
53 | 48 | return client |
|
54 | 49 | |
|
55 | 50 | |
|
56 | 51 | class WebhooksClient(object): |
|
57 | 52 | def __init__(self, url): |
|
58 | 53 | self.api_url = url |
|
59 | 54 | |
|
60 | 55 | def make_request(self, url, method='get', data=None): |
|
61 | 56 | headers = { |
|
62 | 57 | 'Content-Type': 'application/json', |
|
63 | 58 | 'User-Agent': 'appenlight-webhooks', |
|
64 | 59 | } |
|
65 | 60 | try: |
|
66 | 61 | if data: |
|
67 | 62 | data = json.dumps(data) |
|
68 | 63 | resp = getattr(requests, method)(url, data=data, headers=headers, |
|
69 | 64 | timeout=3) |
|
70 | 65 | except Exception as e: |
|
71 | 66 | raise IntegrationException( |
|
72 | 67 | _('Error communicating with Webhooks: {}').format(e)) |
|
73 | 68 | if resp.status_code > 299: |
|
74 | 69 | raise IntegrationException( |
|
75 | 70 | 'Error communicating with Webhooks - status code: {}'.format( |
|
76 | 71 | resp.status_code)) |
|
77 | 72 | return resp |
|
78 | 73 | |
|
79 | 74 | def send_to_hook(self, payload): |
|
80 | 75 | return self.make_request(self.api_url, method='post', |
|
81 | 76 | data=payload).json() |
|
82 | 77 | |
|
83 | 78 | |
|
84 | 79 | class WebhooksAlertChannel(AlertChannel): |
|
85 | 80 | __mapper_args__ = { |
|
86 | 81 | 'polymorphic_identity': 'webhooks' |
|
87 | 82 | } |
|
88 | 83 | |
|
89 | 84 | def notify_reports(self, **kwargs): |
|
90 | 85 | """ |
|
91 | 86 | Notify user of individual reports |
|
92 | 87 | |
|
93 | 88 | kwargs: |
|
94 | 89 | application: application that the event applies for, |
|
95 | 90 | user: user that should be notified |
|
96 | 91 | request: request object |
|
97 | 92 | since_when: reports are newer than this time value, |
|
98 | 93 | reports: list of reports to render |
|
99 | 94 | |
|
100 | 95 | """ |
|
101 | 96 | template_vars = self.get_notification_basic_vars(kwargs) |
|
102 | 97 | payload = [] |
|
103 | 98 | include_keys = ('id', 'http_status', 'report_type', 'resource_name', |
|
104 | 99 | 'front_url', 'resource_id', 'error', 'url_path', |
|
105 | 100 | 'tags', 'duration') |
|
106 | 101 | |
|
107 | 102 | for occurences, report in kwargs['reports']: |
|
108 | 103 | r_dict = report.last_report_ref.get_dict(kwargs['request'], |
|
109 | 104 | include_keys=include_keys) |
|
110 | 105 | r_dict['group']['occurences'] = occurences |
|
111 | 106 | payload.append(r_dict) |
|
112 | 107 | client = WebhooksIntegration.create_client( |
|
113 | 108 | self.integration.config['reports_webhook']) |
|
114 | 109 | client.send_to_hook(payload) |
|
115 | 110 | |
|
116 | 111 | def notify_alert(self, **kwargs): |
|
117 | 112 | """ |
|
118 | 113 | Notify user of report or uptime threshold events based on events alert type |
|
119 | 114 | |
|
120 | 115 | Kwargs: |
|
121 | 116 | application: application that the event applies for, |
|
122 | 117 | event: event that is notified, |
|
123 | 118 | user: user that should be notified |
|
124 | 119 | request: request object |
|
125 | 120 | |
|
126 | 121 | """ |
|
127 | 122 | payload = { |
|
128 | 123 | 'alert_action': kwargs['event'].unified_alert_action(), |
|
129 | 124 | 'alert_name': kwargs['event'].unified_alert_name(), |
|
130 | 125 | 'event_time': kwargs['event'].end_date or kwargs[ |
|
131 | 126 | 'event'].start_date, |
|
132 | 127 | 'resource_name': None, |
|
133 | 128 | 'resource_id': None |
|
134 | 129 | } |
|
135 | 130 | if kwargs['event'].values and kwargs['event'].values.get('reports'): |
|
136 | 131 | payload['reports'] = kwargs['event'].values.get('reports', []) |
|
137 | 132 | if 'application' in kwargs: |
|
138 | 133 | payload['resource_name'] = kwargs['application'].resource_name |
|
139 | 134 | payload['resource_id'] = kwargs['application'].resource_id |
|
140 | 135 | |
|
141 | 136 | client = WebhooksIntegration.create_client( |
|
142 | 137 | self.integration.config['alerts_webhook']) |
|
143 | 138 | client.send_to_hook(payload) |
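The dict assembled in `WebhooksAlertChannel.notify_alert` above can be sketched as a standalone function. The key names mirror the code; `event` is reduced to a plain dict for this sketch and the sample values are invented:

```python
from datetime import datetime


def build_alert_payload(event, application=None):
    # Mirrors the payload shape built in notify_alert: event_time prefers
    # end_date, falling back to start_date; resource fields default to None.
    payload = {
        'alert_action': event['action'],
        'alert_name': event['name'],
        'event_time': event.get('end_date') or event['start_date'],
        'resource_name': None,
        'resource_id': None,
    }
    if application:
        payload['resource_name'] = application['resource_name']
        payload['resource_id'] = application['resource_id']
    return payload


event = {'action': 'OPEN', 'name': 'error_report_alert',
         'start_date': datetime(2017, 1, 1), 'end_date': None}
print(build_alert_payload(event, {'resource_name': 'my-app',
                                  'resource_id': 7}))
```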
@@ -1,135 +1,130 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | import logging |
|
24 | 19 | import hashlib |
|
25 | 20 | |
|
26 | 21 | from datetime import datetime |
|
27 | 22 | from appenlight.models import Base |
|
28 | 23 | from appenlight.lib.utils import convert_es_type |
|
29 | 24 | from appenlight.lib.enums import LogLevel |
|
30 | 25 | from sqlalchemy.dialects.postgresql import JSON |
|
31 | 26 | from ziggurat_foundations.models.base import BaseModel |
|
32 | 27 | |
|
33 | 28 | log = logging.getLogger(__name__) |
|
34 | 29 | |
|
35 | 30 | |
|
36 | 31 | class Log(Base, BaseModel): |
|
37 | 32 | __tablename__ = 'logs' |
|
38 | 33 | __table_args__ = {'implicit_returning': False} |
|
39 | 34 | |
|
40 | 35 | log_id = sa.Column(sa.BigInteger(), nullable=False, primary_key=True) |
|
41 | 36 | resource_id = sa.Column(sa.Integer(), |
|
42 | 37 | sa.ForeignKey('applications.resource_id', |
|
43 | 38 | onupdate='CASCADE', |
|
44 | 39 | ondelete='CASCADE'), |
|
45 | 40 | nullable=False, |
|
46 | 41 | index=True) |
|
47 | 42 | log_level = sa.Column(sa.Unicode, nullable=False, index=True, |
|
48 | 43 | default='INFO') |
|
49 | 44 | message = sa.Column(sa.UnicodeText(), default='') |
|
50 | 45 | timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, |
|
51 | 46 | server_default=sa.func.now()) |
|
52 | 47 | request_id = sa.Column(sa.Unicode()) |
|
53 | 48 | namespace = sa.Column(sa.Unicode()) |
|
54 | 49 | primary_key = sa.Column(sa.Unicode()) |
|
55 | 50 | |
|
56 | 51 | tags = sa.Column(JSON(), default={}) |
|
57 | 52 | permanent = sa.Column(sa.Boolean(), nullable=False, default=False) |
|
58 | 53 | |
|
59 | 54 | def __str__(self): |
|
60 | 55 | return self.__unicode__().encode('utf8') |
|
61 | 56 | |
|
62 | 57 | def __unicode__(self): |
|
63 | 58 | return '<Log id:%s, lv:%s, ns:%s >' % ( |
|
64 | 59 | self.log_id, self.log_level, self.namespace) |
|
65 | 60 | |
|
66 | 61 | def set_data(self, data, resource): |
|
67 | 62 | level = data.get('log_level').upper() |
|
68 | 63 | self.log_level = getattr(LogLevel, level, LogLevel.UNKNOWN) |
|
69 | 64 | self.message = data.get('message', '') |
|
70 | 65 | server_name = data.get('server', '').lower() or 'unknown' |
|
71 | 66 | self.tags = { |
|
72 | 67 | 'server_name': server_name |
|
73 | 68 | } |
|
74 | 69 | if data.get('tags'): |
|
75 | 70 | for tag_tuple in data['tags']: |
|
76 | 71 | self.tags[tag_tuple[0]] = tag_tuple[1] |
|
77 | 72 | self.timestamp = data['date'] |
|
78 | 73 | r_id = data.get('request_id', '') |
|
79 | 74 | if not r_id: |
|
80 | 75 | r_id = '' |
|
81 | 76 | self.request_id = r_id.replace('-', '') |
|
82 | 77 | self.resource_id = resource.resource_id |
|
83 | 78 | self.namespace = data.get('namespace') or '' |
|
84 | 79 | self.permanent = data.get('permanent') |
|
85 | 80 | self.primary_key = data.get('primary_key') |
|
86 | 81 | if self.primary_key is not None: |
|
87 | 82 | self.tags['appenlight_primary_key'] = self.primary_key |
|
88 | 83 | |
|
89 | 84 | def get_dict(self): |
|
90 | 85 | instance_dict = super(Log, self).get_dict() |
|
91 | 86 | instance_dict['log_level'] = LogLevel.key_from_value(self.log_level) |
|
92 | 87 | instance_dict['resource_name'] = self.application.resource_name |
|
93 | 88 | return instance_dict |
|
94 | 89 | |
|
95 | 90 | @property |
|
96 | 91 | def delete_hash(self): |
|
97 | 92 | if not self.primary_key: |
|
98 | 93 | return None |
|
99 | 94 | |
|
100 | 95 | to_hash = '{}_{}_{}'.format(self.resource_id, self.primary_key, |
|
101 | 96 | self.namespace) |
|
102 | 97 | return hashlib.sha1(to_hash.encode('utf8')).hexdigest() |
|
103 | 98 | |
|
104 | 99 | def es_doc(self): |
|
105 | 100 | tags = {} |
|
106 | 101 | tag_list = [] |
|
107 | 102 | for name, value in self.tags.items(): |
|
108 | 103 | # replace dot in indexed tag name |
|
109 | 104 | name = name.replace('.', '_') |
|
110 | 105 | tag_list.append(name) |
|
111 | 106 | tags[name] = { |
|
112 | 107 | "values": convert_es_type(value), |
|
113 | 108 | "numeric_values": value if ( |
|
114 | 109 | isinstance(value, (int, float)) and |
|
115 | 110 | not isinstance(value, bool)) else None |
|
116 | 111 | } |
|
117 | 112 | return { |
|
118 | 113 | 'pg_id': str(self.log_id), |
|
119 | 114 | 'delete_hash': self.delete_hash, |
|
120 | 115 | 'resource_id': self.resource_id, |
|
121 | 116 | 'request_id': self.request_id, |
|
122 | 117 | 'log_level': LogLevel.key_from_value(self.log_level), |
|
123 | 118 | 'timestamp': self.timestamp, |
|
124 | 119 | 'message': self.message if self.message else '', |
|
125 | 120 | 'namespace': self.namespace if self.namespace else '', |
|
126 | 121 | 'tags': tags, |
|
127 | 122 | 'tag_list': tag_list |
|
128 | 123 | } |
|
129 | 124 | |
|
130 | 125 | @property |
|
131 | 126 | def partition_id(self): |
|
132 | 127 | if self.permanent: |
|
133 | 128 | return 'rcae_l_%s' % self.timestamp.strftime('%Y_%m') |
|
134 | 129 | else: |
|
135 | 130 | return 'rcae_l_%s' % self.timestamp.strftime('%Y_%m_%d') |
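`Log.delete_hash` above reduces to a small pure function; sketched here outside the ORM, with argument names following the model's columns:

```python
import hashlib


def delete_hash(resource_id, primary_key, namespace):
    # Mirrors Log.delete_hash: None without a primary key, otherwise
    # sha1 over "resource_id_primary_key_namespace".
    if not primary_key:
        return None
    to_hash = '{}_{}_{}'.format(resource_id, primary_key, namespace)
    return hashlib.sha1(to_hash.encode('utf8')).hexdigest()


print(delete_hash(1, 'user-42', 'auth'))
```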
@@ -1,69 +1,64 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from datetime import datetime |
|
23 | 18 | |
|
24 | 19 | import sqlalchemy as sa |
|
25 | 20 | from sqlalchemy.dialects.postgresql import JSON |
|
26 | 21 | |
|
27 | 22 | from ziggurat_foundations.models.base import BaseModel |
|
28 | 23 | from appenlight.lib.utils import convert_es_type |
|
29 | 24 | from appenlight.models import Base |
|
30 | 25 | |
|
31 | 26 | |
|
32 | 27 | class Metric(Base, BaseModel): |
|
33 | 28 | __tablename__ = 'metrics' |
|
34 | 29 | __table_args__ = {'implicit_returning': False} |
|
35 | 30 | |
|
36 | 31 | pkey = sa.Column(sa.BigInteger(), primary_key=True) |
|
37 | 32 | resource_id = sa.Column(sa.Integer(), |
|
38 | 33 | sa.ForeignKey('applications.resource_id'), |
|
39 | 34 | nullable=False, primary_key=True) |
|
40 | 35 | timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, |
|
41 | 36 | server_default=sa.func.now()) |
|
42 | 37 | tags = sa.Column(JSON(), default={}) |
|
43 | 38 | namespace = sa.Column(sa.Unicode(255)) |
|
44 | 39 | |
|
45 | 40 | @property |
|
46 | 41 | def partition_id(self): |
|
47 | 42 | return 'rcae_m_%s' % self.timestamp.strftime('%Y_%m_%d') |
|
48 | 43 | |
|
49 | 44 | def es_doc(self): |
|
50 | 45 | tags = {} |
|
51 | 46 | tag_list = [] |
|
52 | 47 | for name, value in self.tags.items(): |
|
53 | 48 | # replace dot in indexed tag name |
|
54 | 49 | name = name.replace('.', '_') |
|
55 | 50 | tag_list.append(name) |
|
56 | 51 | tags[name] = { |
|
57 | 52 | "values": convert_es_type(value), |
|
58 | 53 | "numeric_values": value if ( |
|
59 | 54 | isinstance(value, (int, float)) and |
|
60 | 55 | not isinstance(value, bool)) else None |
|
61 | 56 | } |
|
62 | 57 | |
|
63 | 58 | return { |
|
64 | 59 | 'resource_id': self.resource_id, |
|
65 | 60 | 'timestamp': self.timestamp, |
|
66 | 61 | 'namespace': self.namespace, |
|
67 | 62 | 'tags': tags, |
|
68 | 63 | 'tag_list': tag_list |
|
69 | 64 | } |
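The tag loop shared by `Log.es_doc()` and `Metric.es_doc()` can be sketched standalone. `convert_es_type` is replaced by an identity pass-through here (an assumption, to keep the sketch dependency-free); the dot replacement and the bool-excluding `numeric_values` logic follow the code:

```python
def es_tags(tags):
    # Dots in tag names become underscores so the indexed field names
    # stay flat; numeric_values is filled only for real numbers
    # (bools are explicitly excluded, since bool is a subclass of int).
    out, tag_list = {}, []
    for name, value in tags.items():
        name = name.replace('.', '_')
        tag_list.append(name)
        out[name] = {
            'values': value,
            'numeric_values': value if (
                isinstance(value, (int, float)) and
                not isinstance(value, bool)) else None,
        }
    return out, tag_list


tags, names = es_tags({'server.name': 'web01', 'count': 3, 'ok': True})
print(names)  # ['server_name', 'count', 'ok']
```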
@@ -1,45 +1,40 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | from ziggurat_foundations.models.base import BaseModel |
|
24 | 19 | from sqlalchemy.dialects.postgres import JSON |
|
25 | 20 | |
|
26 | 21 | from . import Base |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | class PluginConfig(Base, BaseModel): |
|
30 | 25 | __tablename__ = 'plugin_configs' |
|
31 | 26 | |
|
32 | 27 | id = sa.Column(sa.Integer, primary_key=True) |
|
33 | 28 | plugin_name = sa.Column(sa.Unicode) |
|
34 | 29 | section = sa.Column(sa.Unicode) |
|
35 | 30 | config = sa.Column(JSON, nullable=False) |
|
36 | 31 | resource_id = sa.Column(sa.Integer(), |
|
37 | 32 | sa.ForeignKey('resources.resource_id', |
|
38 | 33 | onupdate='cascade', |
|
39 | 34 | ondelete='cascade')) |
|
40 | 35 | owner_id = sa.Column(sa.Integer(), |
|
41 | 36 | sa.ForeignKey('users.id', onupdate='cascade', |
|
42 | 37 | ondelete='cascade')) |
|
43 | 38 | |
|
44 | 39 | def __json__(self, request): |
|
45 | 40 | return self.get_dict() |
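Review note on `PluginConfig` above: it imports `JSON` from `sqlalchemy.dialects.postgres`, a long-deprecated alias, while every other model in this changeset uses the canonical module path. The consistent import would be:

```python
from sqlalchemy.dialects.postgresql import JSON
```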
@@ -1,519 +1,514 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from datetime import datetime, timedelta |
|
23 | 18 | import math |
|
24 | 19 | import uuid |
|
25 | 20 | import hashlib |
|
26 | 21 | import copy |
|
27 | 22 | import urllib.parse |
|
28 | 23 | import logging |
|
29 | 24 | import sqlalchemy as sa |
|
30 | 25 | |
|
31 | 26 | from appenlight.models import Base, Datastores |
|
32 | 27 | from appenlight.lib.utils.date_utils import convert_date |
|
33 | 28 | from appenlight.lib.utils import convert_es_type |
|
34 | 29 | from appenlight.models.slow_call import SlowCall |
|
35 | 30 | from appenlight.lib.utils import channelstream_request |
|
36 | 31 | from appenlight.lib.enums import ReportType, Language |
|
37 | 32 | from pyramid.threadlocal import get_current_registry, get_current_request |
|
38 | 33 | from sqlalchemy.dialects.postgresql import JSON |
|
39 | 34 | from ziggurat_foundations.models.base import BaseModel |
|
40 | 35 | |
|
41 | 36 | log = logging.getLogger(__name__) |
|
42 | 37 | |
|
43 | 38 | REPORT_TYPE_MATRIX = { |
|
44 | 39 | 'http_status': {"type": 'int', |
|
45 | 40 | "ops": ('eq', 'ne', 'ge', 'le',)}, |
|
46 | 41 | 'group:priority': {"type": 'int', |
|
47 | 42 | "ops": ('eq', 'ne', 'ge', 'le',)}, |
|
48 | 43 | 'duration': {"type": 'float', |
|
49 | 44 | "ops": ('ge', 'le',)}, |
|
50 | 45 | 'url_domain': {"type": 'unicode', |
|
51 | 46 | "ops": ('eq', 'ne', 'startswith', 'endswith', 'contains',)}, |
|
52 | 47 | 'url_path': {"type": 'unicode', |
|
53 | 48 | "ops": ('eq', 'ne', 'startswith', 'endswith', 'contains',)}, |
|
54 | 49 | 'error': {"type": 'unicode', |
|
55 | 50 | "ops": ('eq', 'ne', 'startswith', 'endswith', 'contains',)}, |
|
56 | 51 | 'tags:server_name': {"type": 'unicode', |
|
57 | 52 | "ops": ('eq', 'ne', 'startswith', 'endswith', |
|
58 | 53 | 'contains',)}, |
|
59 | 54 | 'traceback': {"type": 'unicode', |
|
60 | 55 | "ops": ('contains',)}, |
|
61 | 56 | 'group:occurences': {"type": 'int', |
|
62 | 57 | "ops": ('eq', 'ne', 'ge', 'le',)} |
|
63 | 58 | } |
|
64 | 59 | |
|
65 | 60 | |
|
66 | 61 | class Report(Base, BaseModel): |
|
67 | 62 | __tablename__ = 'reports' |
|
68 | 63 | __table_args__ = {'implicit_returning': False} |
|
69 | 64 | |
|
70 | 65 | id = sa.Column(sa.Integer, nullable=False, primary_key=True) |
|
71 | 66 | group_id = sa.Column(sa.BigInteger, |
|
72 | 67 | sa.ForeignKey('reports_groups.id', ondelete='cascade', |
|
73 | 68 | onupdate='cascade')) |
|
74 | 69 | resource_id = sa.Column(sa.Integer(), nullable=False, index=True) |
|
75 | 70 | report_type = sa.Column(sa.Integer(), nullable=False, index=True) |
|
76 | 71 | error = sa.Column(sa.UnicodeText(), index=True) |
|
77 | 72 | extra = sa.Column(JSON(), default={}) |
|
78 | 73 | request = sa.Column(JSON(), nullable=False, default={}) |
|
79 | 74 | ip = sa.Column(sa.String(39), index=True, default='') |
|
80 | 75 | username = sa.Column(sa.Unicode(255), default='') |
|
81 | 76 | user_agent = sa.Column(sa.Unicode(255), default='') |
|
82 | 77 | url = sa.Column(sa.UnicodeText(), index=True) |
|
83 | 78 | request_id = sa.Column(sa.Text()) |
|
84 | 79 | request_stats = sa.Column(JSON(), nullable=False, default={}) |
|
85 | 80 | traceback = sa.Column(JSON(), nullable=False, default=None) |
|
86 | 81 | traceback_hash = sa.Column(sa.Text()) |
|
87 | 82 | start_time = sa.Column(sa.DateTime(), default=datetime.utcnow, |
|
88 | 83 | server_default=sa.func.now()) |
|
89 | 84 | end_time = sa.Column(sa.DateTime()) |
|
90 | 85 | duration = sa.Column(sa.Float, default=0) |
|
91 | 86 | http_status = sa.Column(sa.Integer, index=True) |
|
92 | 87 | url_domain = sa.Column(sa.Unicode(100), index=True) |
|
93 | 88 | url_path = sa.Column(sa.Unicode(255), index=True) |
|
94 | 89 | tags = sa.Column(JSON(), nullable=False, default={}) |
|
95 | 90 | language = sa.Column(sa.Integer(), default=0) |
|
96 | 91 | # this is used to determine partition for the report |
|
97 | 92 | report_group_time = sa.Column(sa.DateTime(), default=datetime.utcnow, |
|
98 | 93 | server_default=sa.func.now()) |
|
99 | 94 | |
|
100 | 95 | logs = sa.orm.relationship( |
|
101 | 96 | 'Log', |
|
102 | 97 | lazy='dynamic', |
|
103 | 98 | passive_deletes=True, |
|
104 | 99 | passive_updates=True, |
|
105 | 100 | primaryjoin="and_(Report.request_id==Log.request_id, " |
|
106 | 101 | "Log.request_id != None, Log.request_id != '')", |
|
107 | 102 | foreign_keys='[Log.request_id]') |
|
108 | 103 | |
|
109 | 104 | slow_calls = sa.orm.relationship('SlowCall', |
|
110 | 105 | backref='detail', |
|
111 | 106 | cascade="all, delete-orphan", |
|
112 | 107 | passive_deletes=True, |
|
113 | 108 | passive_updates=True, |
|
114 | 109 | order_by='SlowCall.timestamp') |
|
115 | 110 | |
|
116 | 111 | def set_data(self, data, resource, protocol_version=None): |
|
117 | 112 | self.http_status = data['http_status'] |
|
118 | 113 | self.priority = data['priority'] |
|
119 | 114 | self.error = data['error'] |
|
120 | 115 | report_language = data.get('language', '').lower() |
|
121 | 116 | self.language = getattr(Language, report_language, Language.unknown) |
|
122 | 117 | # we need temp holder here to decide later |
|
123 | 118 | # if we want to to commit the tags if report is marked for creation |
|
124 | 119 | self.tags = { |
|
125 | 120 | 'server_name': data['server'], |
|
126 | 121 | 'view_name': data['view_name'] |
|
127 | 122 | } |
|
128 | 123 | if data.get('tags'): |
|
129 | 124 | for tag_tuple in data['tags']: |
|
130 | 125 | self.tags[tag_tuple[0]] = tag_tuple[1] |
|
131 | 126 | self.traceback = data['traceback'] |
|
132 | 127 | stripped_traceback = self.stripped_traceback() |
|
133 | 128 | tb_repr = repr(stripped_traceback).encode('utf8') |
|
134 | 129 | self.traceback_hash = hashlib.sha1(tb_repr).hexdigest() |
|
135 | 130 | url_info = urllib.parse.urlsplit( |
|
136 | 131 | data.get('url', ''), allow_fragments=False) |
|
137 | 132 | self.url_domain = url_info.netloc[:128] |
|
138 | 133 | self.url_path = url_info.path[:2048] |
|
139 | 134 | self.occurences = data['occurences'] |
|
140 | 135 | if self.error: |
|
141 | 136 | self.report_type = ReportType.error |
|
142 | 137 | else: |
|
143 | 138 | self.report_type = ReportType.slow |
|
144 | 139 | |
|
145 | 140 | # but if its status 404 its 404 type |
|
146 | 141 | if self.http_status in [404, '404'] or self.error == '404 Not Found': |
|
147 | 142 | self.report_type = ReportType.not_found |
|
148 | 143 | self.error = '' |
|
149 | 144 | |
|
150 | 145 | self.generate_grouping_hash(data.get('appenlight.group_string', |
|
151 | 146 | data.get('group_string')), |
|
152 | 147 | resource.default_grouping, |
|
153 | 148 | protocol_version) |
|
154 | 149 | |
|
155 | 150 | # details |
|
156 | 151 | if data['http_status'] in [404, '404']: |
|
157 | 152 | data = {"username": data["username"], |
|
158 | 153 | "ip": data["ip"], |
|
159 | 154 | "url": data["url"], |
|
160 | 155 | "user_agent": data["user_agent"]} |
|
161 | 156 | if data.get('HTTP_REFERER') or data.get('http_referer'): |
|
162 | 157 | data['HTTP_REFERER'] = data.get( |
|
163 | 158 | 'HTTP_REFERER', '') or data.get('http_referer', '') |
|
164 | 159 | |
|
165 | 160 | self.resource_id = resource.resource_id |
|
166 | 161 | self.username = data['username'] |
|
167 | 162 | self.user_agent = data['user_agent'] |
|
168 | 163 | self.ip = data['ip'] |
|
169 | 164 | self.extra = {} |
|
170 | 165 | if data.get('extra'): |
|
171 | 166 | for extra_tuple in data['extra']: |
|
172 | 167 | self.extra[extra_tuple[0]] = extra_tuple[1] |
|
173 | 168 | |
|
174 | 169 | self.url = data['url'] |
|
175 | 170 | self.request_id = data.get('request_id', '').replace('-', '') or str( |
|
176 | 171 | uuid.uuid4()) |
|
177 | 172 | request_data = data.get('request', {}) |
|
178 | 173 | |
|
179 | 174 | self.request = request_data |
|
180 | 175 | self.request_stats = data.get('request_stats', {}) |
|
181 | 176 | traceback = data.get('traceback') |
|
182 | 177 | if not traceback: |
|
183 | 178 | traceback = data.get('frameinfo') |
|
184 | 179 | self.traceback = traceback |
|
185 | 180 | start_date = convert_date(data.get('start_time')) |
|
186 | 181 | if not self.start_time or self.start_time < start_date: |
|
187 | 182 | self.start_time = start_date |
|
188 | 183 | |
|
189 | 184 | self.end_time = convert_date(data.get('end_time'), False) |
|
190 | 185 | self.duration = 0 |
|
191 | 186 | |
|
192 | 187 | if self.start_time and self.end_time: |
|
193 | 188 | d = self.end_time - self.start_time |
|
194 | 189 | self.duration = d.total_seconds() |
|
195 | 190 | |
|
196 | 191 | # update tags with other vars |
|
197 | 192 | if self.username: |
|
198 | 193 | self.tags['user_name'] = self.username |
|
199 | 194 | self.tags['report_language'] = Language.key_from_value(self.language) |
|
200 | 195 | |
|
201 | 196 | def add_slow_calls(self, data, report_group): |
|
202 | 197 | slow_calls = [] |
|
203 | 198 | for call in data.get('slow_calls', []): |
|
204 | 199 | sc_inst = SlowCall() |
|
205 | 200 | sc_inst.set_data(call, resource_id=self.resource_id, |
|
206 | 201 | report_group=report_group) |
|
207 | 202 | slow_calls.append(sc_inst) |
|
208 | 203 | self.slow_calls.extend(slow_calls) |
|
209 | 204 | return slow_calls |
|
210 | 205 | |
|
211 | 206 | def get_dict(self, request, details=False, exclude_keys=None, |
|
212 | 207 | include_keys=None): |
|
213 | 208 | from appenlight.models.services.report_group import ReportGroupService |
|
214 | 209 | instance_dict = super(Report, self).get_dict() |
|
215 | 210 | instance_dict['req_stats'] = self.req_stats() |
|
216 | 211 | instance_dict['group'] = {} |
|
217 | 212 | instance_dict['group']['id'] = self.report_group.id |
|
218 | 213 | instance_dict['group'][ |
|
219 | 214 | 'total_reports'] = self.report_group.total_reports |
|
220 | 215 | instance_dict['group']['last_report'] = self.report_group.last_report |
|
221 | 216 | instance_dict['group']['priority'] = self.report_group.priority |
|
222 | 217 | instance_dict['group']['occurences'] = self.report_group.occurences |
|
223 | 218 | instance_dict['group'][ |
|
224 | 219 | 'last_timestamp'] = self.report_group.last_timestamp |
|
225 | 220 | instance_dict['group'][ |
|
226 | 221 | 'first_timestamp'] = self.report_group.first_timestamp |
|
227 | 222 | instance_dict['group']['public'] = self.report_group.public |
|
228 | 223 | instance_dict['group']['fixed'] = self.report_group.fixed |
|
229 | 224 | instance_dict['group']['read'] = self.report_group.read |
|
230 | 225 | instance_dict['group'][ |
|
231 | 226 | 'average_duration'] = self.report_group.average_duration |
|
232 | 227 | |
|
233 | 228 | instance_dict[ |
|
234 | 229 | 'resource_name'] = self.report_group.application.resource_name |
|
235 | 230 | instance_dict['report_type'] = self.report_type |
|
236 | 231 | |
|
237 | 232 | if instance_dict['http_status'] == 404 and not instance_dict['error']: |
|
238 | 233 | instance_dict['error'] = '404 Not Found' |
|
239 | 234 | |
|
240 | 235 | if details: |
|
241 | 236 | instance_dict['affected_users_count'] = \ |
|
242 | 237 | ReportGroupService.affected_users_count(self.report_group) |
|
243 | 238 | instance_dict['top_affected_users'] = [ |
|
244 | 239 | {'username': u.username, 'count': u.count} for u in |
|
245 | 240 | ReportGroupService.top_affected_users(self.report_group)] |
|
246 | 241 | instance_dict['application'] = {'integrations': []} |
|
247 | 242 | for integration in self.report_group.application.integrations: |
|
248 | 243 | if integration.front_visible: |
|
249 | 244 | instance_dict['application']['integrations'].append( |
|
250 | 245 | {'name': integration.integration_name, |
|
251 | 246 | 'action': integration.integration_action}) |
|
252 | 247 | instance_dict['comments'] = [c.get_dict() for c in |
|
253 | 248 | self.report_group.comments] |
|
254 | 249 | |
|
255 | 250 | instance_dict['group']['next_report'] = None |
|
256 | 251 | instance_dict['group']['previous_report'] = None |
|
257 | 252 | next_in_group = self.get_next_in_group(request) |
|
258 | 253 | previous_in_group = self.get_previous_in_group(request) |
|
259 | 254 | if next_in_group: |
|
260 | 255 | instance_dict['group']['next_report'] = next_in_group |
|
261 | 256 | if previous_in_group: |
|
262 | 257 | instance_dict['group']['previous_report'] = previous_in_group |
|
263 | 258 | |
|
264 | 259 | # slow call ordering |
|
265 | 260 | def find_parent(row, data): |
|
266 | 261 | for r in reversed(data): |
|
267 | 262 | try: |
|
268 | 263 | if (row['timestamp'] > r['timestamp'] and |
|
269 | 264 | row['end_time'] < r['end_time']): |
|
270 | 265 | return r |
|
271 | 266 | except TypeError as e: |
|
272 | 267 | log.warning('reports_view.find_parent: %s' % e) |
|
273 | 268 | return None |
|
274 | 269 | |
|
275 | 270 | new_calls = [] |
|
276 | 271 | calls = [c.get_dict() for c in self.slow_calls] |
|
277 | 272 | while calls: |
|
278 | 273 | # start from end |
|
279 | 274 | for x in range(len(calls) - 1, -1, -1): |
|
280 | 275 | parent = find_parent(calls[x], calls) |
|
281 | 276 | if parent: |
|
282 | 277 | parent['children'].append(calls[x]) |
|
283 | 278 | else: |
|
284 | 279 | # no parent at all? append to new_calls anyway
|
285 | 280 | new_calls.append(calls[x]) |
|
286 | 281 | # print 'append', calls[x] |
|
287 | 282 | del calls[x] |
|
288 | 283 | break |
|
289 | 284 | instance_dict['slow_calls'] = new_calls |
|
290 | 285 | |
|
291 | 286 | instance_dict['front_url'] = self.get_public_url(request) |
|
292 | 287 | |
|
293 | 288 | exclude_keys_list = exclude_keys or [] |
|
294 | 289 | include_keys_list = include_keys or [] |
|
295 | 290 | for k in list(instance_dict.keys()): |
|
296 | 291 | if k == 'group': |
|
297 | 292 | continue |
|
298 | 293 | if (k in exclude_keys_list or |
|
299 | 294 | (k not in include_keys_list and include_keys)): |
|
300 | 295 | del instance_dict[k] |
|
301 | 296 | return instance_dict |
|
302 | 297 | |
|
303 | 298 | def get_previous_in_group(self, request): |
|
304 | 299 | query = { |
|
305 | 300 | "size": 1, |
|
306 | 301 | "query": { |
|
307 | 302 | "filtered": { |
|
308 | 303 | "filter": { |
|
309 | 304 | "and": [{"term": {"group_id": self.group_id}}, |
|
310 | 305 | {"range": {"pg_id": {"lt": self.id}}}] |
|
311 | 306 | } |
|
312 | 307 | } |
|
313 | 308 | }, |
|
314 | 309 | "sort": [ |
|
315 | 310 | {"_doc": {"order": "desc"}}, |
|
316 | 311 | ], |
|
317 | 312 | } |
|
318 | 313 | result = request.es_conn.search(query, index=self.partition_id, |
|
319 | 314 | doc_type='report') |
|
320 | 315 | if result['hits']['total']: |
|
321 | 316 | return result['hits']['hits'][0]['_source']['pg_id'] |
|
322 | 317 | |
|
323 | 318 | def get_next_in_group(self, request): |
|
324 | 319 | query = { |
|
325 | 320 | "size": 1, |
|
326 | 321 | "query": { |
|
327 | 322 | "filtered": { |
|
328 | 323 | "filter": { |
|
329 | 324 | "and": [{"term": {"group_id": self.group_id}}, |
|
330 | 325 | {"range": {"pg_id": {"gt": self.id}}}] |
|
331 | 326 | } |
|
332 | 327 | } |
|
333 | 328 | }, |
|
334 | 329 | "sort": [ |
|
335 | 330 | {"_doc": {"order": "asc"}}, |
|
336 | 331 | ], |
|
337 | 332 | } |
|
338 | 333 | result = request.es_conn.search(query, index=self.partition_id, |
|
339 | 334 | doc_type='report') |
|
340 | 335 | if result['hits']['total']: |
|
341 | 336 | return result['hits']['hits'][0]['_source']['pg_id'] |
|
342 | 337 | |
|
343 | 338 | def get_public_url(self, request=None, report_group=None, _app_url=None): |
|
344 | 339 | """ |
|
345 | 340 | Returns a URL that the user can use to visit a specific report
|
346 | 341 | """ |
|
347 | 342 | if not request: |
|
348 | 343 | request = get_current_request() |
|
349 | 344 | url = request.route_url('/', _app_url=_app_url) |
|
350 | 345 | if report_group: |
|
351 | 346 | return (url + 'ui/report/%s/%s') % (report_group.id, self.id) |
|
352 | 347 | return (url + 'ui/report/%s/%s') % (self.group_id, self.id) |
|
353 | 348 | |
|
354 | 349 | def req_stats(self): |
|
355 | 350 | stats = self.request_stats.copy() |
|
356 | 351 | stats['percentages'] = {} |
|
357 | 352 | stats['percentages']['main'] = 100.0 |
|
358 | 353 | main = stats.get('main', 0.0) |
|
359 | 354 | if not main: |
|
360 | 355 | return None |
|
361 | 356 | for name, call_time in stats.items(): |
|
362 | 357 | if ('calls' not in name and 'main' not in name and |
|
363 | 358 | 'percentages' not in name): |
|
364 | 359 | stats['main'] -= call_time |
|
365 | 360 | stats['percentages'][name] = math.floor( |
|
366 | 361 | (call_time / main * 100.0)) |
|
367 | 362 | stats['percentages']['main'] -= stats['percentages'][name] |
|
368 | 363 | if stats['percentages']['main'] < 0.0: |
|
369 | 364 | stats['percentages']['main'] = 0.0 |
|
370 | 365 | stats['main'] = 0.0 |
|
371 | 366 | return stats |
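The timing split in ``req_stats()`` above — subtracting each sub-call's time from ``main`` and flooring its percentage share — can be sketched as a standalone helper. This is an illustrative reconstruction, not part of the codebase; it builds a fresh result dict instead of mutating ``stats`` in place, but the arithmetic mirrors the method:

```python
import math

def split_percentages(stats):
    # Given {'main': total_seconds, '<layer>': seconds, ...}, compute each
    # layer's floored percentage of the total and leave 'main' with the
    # remainder, clamped at zero -- mirroring Report.req_stats().
    main = stats.get('main', 0.0)
    if not main:
        return None
    result = {'percentages': {'main': 100.0}}
    remaining = main
    for name, call_time in stats.items():
        if name in ('main', 'percentages') or 'calls' in name:
            continue
        remaining -= call_time
        pct = math.floor(call_time / main * 100.0)
        result['percentages'][name] = pct
        result['percentages']['main'] -= pct
    if result['percentages']['main'] < 0.0:
        result['percentages']['main'] = 0.0
        remaining = 0.0
    result['main'] = max(remaining, 0.0)
    return result
```

Flooring each share before subtracting it from 100 means rounding error accumulates into ``main`` rather than pushing the total over 100%.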
|
372 | 367 | |
|
373 | 368 | def generate_grouping_hash(self, hash_string=None, default_grouping=None, |
|
374 | 369 | protocol_version=None): |
|
375 | 370 | """ |
|
376 | 371 | Generates SHA1 hash that will be used to group reports together |
|
377 | 372 | """ |
|
378 | 373 | if not hash_string: |
|
379 | 374 | location = self.tags.get('view_name') or self.url_path
|
380 | 375 | server_name = self.tags.get('server_name') or '' |
|
381 | 376 | if default_grouping == 'url_traceback': |
|
382 | 377 | hash_string = '%s_%s_%s' % (self.traceback_hash, location, |
|
383 | 378 | self.error) |
|
384 | 379 | if self.language == Language.javascript: |
|
385 | 380 | hash_string = '%s_%s' % (self.traceback_hash, self.error) |
|
386 | 381 | |
|
387 | 382 | elif default_grouping == 'traceback_server': |
|
388 | 383 | hash_string = '%s_%s' % (self.traceback_hash, server_name) |
|
389 | 384 | if self.language == Language.javascript: |
|
390 | 385 | hash_string = '%s_%s' % (self.traceback_hash, server_name) |
|
391 | 386 | else: |
|
392 | 387 | hash_string = '%s_%s' % (self.error, location) |
|
393 | 388 | month = datetime.utcnow().date().replace(day=1) |
|
394 | 389 | hash_string = '{}_{}'.format(month, hash_string) |
|
395 | 390 | binary_string = hash_string.encode('utf8') |
|
396 | 391 | self.grouping_hash = hashlib.sha1(binary_string).hexdigest() |
|
397 | 392 | return self.grouping_hash |
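The month-prefixed SHA1 computed by ``generate_grouping_hash()`` can be illustrated in isolation. The helper name below is hypothetical, not from the codebase; it shows why two identical errors in different calendar months land in different report groups:

```python
import hashlib
from datetime import datetime

def monthly_grouping_hash(error, location, when=None):
    # Illustrative sketch of Report.generate_grouping_hash(): prefix the
    # grouping key with the first day of the month, then SHA1 it, so
    # identical errors group together only within one calendar month.
    month = (when or datetime.utcnow()).date().replace(day=1)
    hash_string = '{}_{}_{}'.format(month, error, location)
    return hashlib.sha1(hash_string.encode('utf8')).hexdigest()
```

Because the month is baked into the hash, groups rotate automatically at each month boundary without any explicit expiry logic.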
|
398 | 393 | |
|
399 | 394 | def stripped_traceback(self): |
|
400 | 395 | """ |
|
401 | 396 | Traceback without local vars |
|
402 | 397 | """ |
|
403 | 398 | stripped_traceback = copy.deepcopy(self.traceback) |
|
404 | 399 | |
|
405 | 400 | if isinstance(stripped_traceback, list): |
|
406 | 401 | for row in stripped_traceback: |
|
407 | 402 | row.pop('vars', None) |
|
408 | 403 | return stripped_traceback |
|
409 | 404 | |
|
410 | 405 | def notify_channel(self, report_group): |
|
411 | 406 | """ |
|
412 | 407 | Sends notification to websocket channel |
|
413 | 408 | """ |
|
414 | 409 | settings = get_current_registry().settings |
|
415 | 410 | log.info('notify channelstream') |
|
416 | 411 | if self.report_type != ReportType.error: |
|
417 | 412 | return |
|
418 | 413 | payload = { |
|
419 | 414 | 'type': 'message', |
|
420 | 415 | "user": '__system__', |
|
421 | 416 | "channel": 'app_%s' % self.resource_id, |
|
422 | 417 | 'message': { |
|
423 | 418 | 'topic': 'front_dashboard.new_topic', |
|
424 | 419 | 'report': { |
|
425 | 420 | 'group': { |
|
426 | 421 | 'priority': report_group.priority, |
|
427 | 422 | 'first_timestamp': report_group.first_timestamp, |
|
428 | 423 | 'last_timestamp': report_group.last_timestamp, |
|
429 | 424 | 'average_duration': report_group.average_duration, |
|
430 | 425 | 'occurences': report_group.occurences |
|
431 | 426 | }, |
|
432 | 427 | 'report_id': self.id, |
|
433 | 428 | 'group_id': self.group_id, |
|
434 | 429 | 'resource_id': self.resource_id, |
|
435 | 430 | 'http_status': self.http_status, |
|
436 | 431 | 'url_domain': self.url_domain, |
|
437 | 432 | 'url_path': self.url_path, |
|
438 | 433 | 'error': self.error or '', |
|
439 | 434 | 'server': self.tags.get('server_name'), |
|
440 | 435 | 'view_name': self.tags.get('view_name'), |
|
441 | 436 | 'front_url': self.get_public_url(), |
|
442 | 437 | } |
|
443 | 438 | } |
|
444 | 439 | |
|
445 | 440 | } |
|
446 | 441 | channelstream_request(settings['cometd.secret'], '/message', [payload], |
|
447 | 442 | servers=[settings['cometd_servers']]) |
|
448 | 443 | |
|
449 | 444 | def es_doc(self): |
|
450 | 445 | tags = {} |
|
451 | 446 | tag_list = [] |
|
452 | 447 | for name, value in self.tags.items(): |
|
453 | 448 | name = name.replace('.', '_') |
|
454 | 449 | tag_list.append(name) |
|
455 | 450 | tags[name] = { |
|
456 | 451 | "values": convert_es_type(value), |
|
457 | 452 | "numeric_values": value if ( |
|
458 | 453 | isinstance(value, (int, float)) and |
|
459 | 454 | not isinstance(value, bool)) else None} |
|
460 | 455 | |
|
461 | 456 | if 'user_name' not in self.tags and self.username: |
|
462 | 457 | tags["user_name"] = {"value": [self.username], |
|
463 | 458 | "numeric_value": None} |
|
464 | 459 | return { |
|
465 | 460 | '_id': str(self.id), |
|
466 | 461 | 'pg_id': str(self.id), |
|
467 | 462 | 'resource_id': self.resource_id, |
|
468 | 463 | 'http_status': self.http_status or '', |
|
469 | 464 | 'start_time': self.start_time, |
|
470 | 465 | 'end_time': self.end_time, |
|
471 | 466 | 'url_domain': self.url_domain if self.url_domain else '', |
|
472 | 467 | 'url_path': self.url_path if self.url_path else '', |
|
473 | 468 | 'duration': self.duration, |
|
474 | 469 | 'error': self.error if self.error else '', |
|
475 | 470 | 'report_type': self.report_type, |
|
476 | 471 | 'request_id': self.request_id, |
|
477 | 472 | 'ip': self.ip, |
|
478 | 473 | 'group_id': str(self.group_id), |
|
479 | 474 | '_parent': str(self.group_id), |
|
480 | 475 | 'tags': tags, |
|
481 | 476 | 'tag_list': tag_list |
|
482 | 477 | } |
|
483 | 478 | |
|
484 | 479 | @property |
|
485 | 480 | def partition_id(self): |
|
486 | 481 | return 'rcae_r_%s' % self.report_group_time.strftime('%Y_%m') |
|
487 | 482 | |
|
488 | 483 | def partition_range(self): |
|
489 | 484 | start_date = self.report_group_time.date().replace(day=1) |
|
490 | 485 | end_date = start_date + timedelta(days=40) |
|
491 | 486 | end_date = end_date.replace(day=1) |
|
492 | 487 | return start_date, end_date |
|
493 | 488 | |
|
494 | 489 | |
|
495 | 490 | def after_insert(mapper, connection, target): |
|
496 | 491 | if not hasattr(target, '_skip_ft_index'): |
|
497 | 492 | data = target.es_doc() |
|
498 | 493 | data.pop('_id', None) |
|
499 | 494 | Datastores.es.index(target.partition_id, 'report', data, |
|
500 | 495 | parent=target.group_id, id=target.id) |
|
501 | 496 | |
|
502 | 497 | |
|
503 | 498 | def after_update(mapper, connection, target): |
|
504 | 499 | if not hasattr(target, '_skip_ft_index'): |
|
505 | 500 | data = target.es_doc() |
|
506 | 501 | data.pop('_id', None) |
|
507 | 502 | Datastores.es.index(target.partition_id, 'report', data, |
|
508 | 503 | parent=target.group_id, id=target.id) |
|
509 | 504 | |
|
510 | 505 | |
|
511 | 506 | def after_delete(mapper, connection, target): |
|
512 | 507 | if not hasattr(target, '_skip_ft_index'): |
|
513 | 508 | query = {'term': {'pg_id': target.id}} |
|
514 | 509 | Datastores.es.delete_by_query(target.partition_id, 'report', query) |
|
515 | 510 | |
|
516 | 511 | |
|
517 | 512 | sa.event.listen(Report, 'after_insert', after_insert) |
|
518 | 513 | sa.event.listen(Report, 'after_update', after_update) |
|
519 | 514 | sa.event.listen(Report, 'after_delete', after_delete) |
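The slow-call ordering pass in ``Report.get_dict()`` rebuilds a call tree from a flat list: each call is attached to the most recent call whose time span strictly contains it, and unmatched calls become roots. A self-contained sketch of that nesting logic (plain dicts; ``setdefault`` stands in for the ``children`` key the real call dicts already carry, and the ``TypeError`` guard for missing timestamps is omitted for brevity):

```python
def nest_slow_calls(calls):
    # Mirrors the find_parent() loop in Report.get_dict(): repeatedly
    # take the last call, attach it to the latest enclosing call, or
    # promote it to a root if nothing encloses it.
    def find_parent(row, data):
        for r in reversed(data):
            if (row['timestamp'] > r['timestamp'] and
                    row['end_time'] < r['end_time']):
                return r
        return None

    roots = []
    while calls:
        # process one call per pass, starting from the end
        for x in range(len(calls) - 1, -1, -1):
            parent = find_parent(calls[x], calls)
            if parent is not None:
                parent.setdefault('children', []).append(calls[x])
            else:
                roots.append(calls[x])
            del calls[x]
            break
    return roots
```

Processing from the end first guarantees that a nested call is attached to its parent before the parent itself is removed from the working list.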
@@ -1,37 +1,32 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from ziggurat_foundations.models.base import BaseModel |
|
23 | 18 | from appenlight.models import Base |
|
24 | 19 | import sqlalchemy as sa |
|
25 | 20 | |
|
26 | 21 | |
|
27 | 22 | class ReportAssignment(Base, BaseModel): |
|
28 | 23 | __tablename__ = 'reports_assignments' |
|
29 | 24 | |
|
30 | 25 | group_id = sa.Column(sa.BigInteger, |
|
31 | 26 | sa.ForeignKey('reports_groups.id', ondelete='cascade', |
|
32 | 27 | onupdate='cascade'), |
|
33 | 28 | primary_key=True) |
|
34 | 29 | owner_id = sa.Column(sa.Integer, |
|
35 | 30 | sa.ForeignKey('users.id', onupdate='CASCADE', |
|
36 | 31 | ondelete='CASCADE'), primary_key=True) |
|
37 | 32 | report_time = sa.Column(sa.DateTime(), nullable=False) |
@@ -1,55 +1,50 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | |
|
24 | 19 | from datetime import datetime |
|
25 | 20 | from appenlight.models import Base |
|
26 | 21 | from ziggurat_foundations.models.base import BaseModel |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | class ReportComment(Base, BaseModel): |
|
30 | 25 | __tablename__ = 'reports_comments' |
|
31 | 26 | |
|
32 | 27 | comment_id = sa.Column(sa.Integer, nullable=False, primary_key=True) |
|
33 | 28 | group_id = sa.Column(sa.BigInteger, |
|
34 | 29 | sa.ForeignKey('reports_groups.id', ondelete='cascade', |
|
35 | 30 | onupdate='cascade')) |
|
36 | 31 | body = sa.Column(sa.UnicodeText(), default='') |
|
37 | 32 | owner_id = sa.Column(sa.Integer, |
|
38 | 33 | sa.ForeignKey('users.id', onupdate='CASCADE', |
|
39 | 34 | ondelete='CASCADE')) |
|
40 | 35 | created_timestamp = sa.Column(sa.DateTime(), |
|
41 | 36 | default=datetime.utcnow, |
|
42 | 37 | server_default=sa.func.now()) |
|
43 | 38 | report_time = sa.Column(sa.DateTime(), nullable=False) |
|
44 | 39 | |
|
45 | 40 | owner = sa.orm.relationship('User', |
|
46 | 41 | lazy='joined') |
|
47 | 42 | |
|
48 | 43 | @property |
|
49 | 44 | def processed_body(self): |
|
50 | 45 | return self.body |
|
51 | 46 | |
|
52 | 47 | def get_dict(self): |
|
53 | 48 | instance_dict = super(ReportComment, self).get_dict() |
|
54 | 49 | instance_dict['user_name'] = self.owner.user_name |
|
55 | 50 | return instance_dict |
@@ -1,275 +1,270 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import logging |
|
23 | 18 | import sqlalchemy as sa |
|
24 | 19 | |
|
25 | 20 | from datetime import datetime, timedelta |
|
26 | 21 | |
|
27 | 22 | from pyramid.threadlocal import get_current_request |
|
28 | 23 | from sqlalchemy.dialects.postgresql import JSON |
|
29 | 24 | from ziggurat_foundations.models.base import BaseModel |
|
30 | 25 | |
|
31 | 26 | from appenlight.models import Base, get_db_session, Datastores |
|
32 | 27 | from appenlight.lib.enums import ReportType |
|
33 | 28 | from appenlight.lib.rule import Rule |
|
34 | 29 | from appenlight.lib.redis_keys import REDIS_KEYS |
|
35 | 30 | from appenlight.models.report import REPORT_TYPE_MATRIX |
|
36 | 31 | |
|
37 | 32 | log = logging.getLogger(__name__) |
|
38 | 33 | |
|
39 | 34 | |
|
40 | 35 | class ReportGroup(Base, BaseModel): |
|
41 | 36 | __tablename__ = 'reports_groups' |
|
42 | 37 | __table_args__ = {'implicit_returning': False} |
|
43 | 38 | |
|
44 | 39 | id = sa.Column(sa.BigInteger(), nullable=False, primary_key=True) |
|
45 | 40 | resource_id = sa.Column(sa.Integer(), |
|
46 | 41 | sa.ForeignKey('applications.resource_id', |
|
47 | 42 | onupdate='CASCADE', |
|
48 | 43 | ondelete='CASCADE'), |
|
49 | 44 | nullable=False, |
|
50 | 45 | index=True) |
|
51 | 46 | priority = sa.Column(sa.Integer, nullable=False, index=True, default=5, |
|
52 | 47 | server_default='5') |
|
53 | 48 | first_timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, |
|
54 | 49 | server_default=sa.func.now()) |
|
55 | 50 | last_timestamp = sa.Column(sa.DateTime(), default=datetime.utcnow, |
|
56 | 51 | server_default=sa.func.now()) |
|
57 | 52 | error = sa.Column(sa.UnicodeText(), index=True) |
|
58 | 53 | grouping_hash = sa.Column(sa.String(40), default='') |
|
59 | 54 | triggered_postprocesses_ids = sa.Column(JSON(), nullable=False, |
|
60 | 55 | default=list) |
|
61 | 56 | report_type = sa.Column(sa.Integer, default=1) |
|
62 | 57 | total_reports = sa.Column(sa.Integer, default=1) |
|
63 | 58 | last_report = sa.Column(sa.Integer) |
|
64 | 59 | occurences = sa.Column(sa.Integer, default=1) |
|
65 | 60 | average_duration = sa.Column(sa.Float, default=0) |
|
66 | 61 | summed_duration = sa.Column(sa.Float, default=0) |
|
67 | 62 | read = sa.Column(sa.Boolean(), index=True, default=False) |
|
68 | 63 | fixed = sa.Column(sa.Boolean(), index=True, default=False) |
|
69 | 64 | notified = sa.Column(sa.Boolean(), index=True, default=False) |
|
70 | 65 | public = sa.Column(sa.Boolean(), index=True, default=False) |
|
71 | 66 | |
|
72 | 67 | reports = sa.orm.relationship('Report', |
|
73 | 68 | lazy='dynamic', |
|
74 | 69 | backref='report_group', |
|
75 | 70 | cascade="all, delete-orphan", |
|
76 | 71 | passive_deletes=True, |
|
77 | 72 | passive_updates=True, ) |
|
78 | 73 | |
|
79 | 74 | comments = sa.orm.relationship('ReportComment', |
|
80 | 75 | lazy='dynamic', |
|
81 | 76 | backref='report', |
|
82 | 77 | cascade="all, delete-orphan", |
|
83 | 78 | passive_deletes=True, |
|
84 | 79 | passive_updates=True, |
|
85 | 80 | order_by="ReportComment.comment_id") |
|
86 | 81 | |
|
87 | 82 | assigned_users = sa.orm.relationship('User', |
|
88 | 83 | backref=sa.orm.backref( |
|
89 | 84 | 'assigned_reports_relation', |
|
90 | 85 | lazy='dynamic', |
|
91 | 86 | order_by=sa.desc( |
|
92 | 87 | "reports_groups.id") |
|
93 | 88 | ), |
|
94 | 89 | passive_deletes=True, |
|
95 | 90 | passive_updates=True, |
|
96 | 91 | secondary='reports_assignments', |
|
97 | 92 | order_by="User.user_name") |
|
98 | 93 | |
|
99 | 94 | stats = sa.orm.relationship('ReportStat', |
|
100 | 95 | lazy='dynamic', |
|
101 | 96 | backref='report', |
|
102 | 97 | passive_deletes=True, |
|
103 | 98 | passive_updates=True, ) |
|
104 | 99 | |
|
105 | 100 | last_report_ref = sa.orm.relationship('Report', |
|
106 | 101 | uselist=False, |
|
107 | 102 | primaryjoin="ReportGroup.last_report " |
|
108 | 103 | "== Report.id", |
|
109 | 104 | foreign_keys="Report.id", |
|
110 | 105 | cascade="all, delete-orphan", |
|
111 | 106 | passive_deletes=True, |
|
112 | 107 | passive_updates=True, ) |
|
113 | 108 | |
|
114 | 109 | def __repr__(self): |
|
115 | 110 | return '<ReportGroup id:{}>'.format(self.id) |
|
116 | 111 | |
|
117 | 112 | def get_report(self, report_id=None, public=False): |
|
118 | 113 | """ |
|
119 | 114 | Gets report with specific id or latest report if id was not specified |
|
120 | 115 | """ |
|
121 | 116 | from .report import Report |
|
122 | 117 | |
|
123 | 118 | if not report_id: |
|
124 | 119 | return self.last_report_ref |
|
125 | 120 | else: |
|
126 | 121 | return self.reports.filter(Report.id == report_id).first() |
|
127 | 122 | |
|
128 | 123 | def get_public_url(self, request, _app_url=None): |
|
129 | 124 | url = request.route_url('/', _app_url=_app_url) |
|
130 | 125 | return (url + 'ui/report/%s') % self.id |
|
131 | 126 | |
|
132 | 127 | def run_postprocessing(self, report): |
|
133 | 128 | """ |
|
134 | 129 | Alters report group priority based on postprocessing configuration |
|
135 | 130 | """ |
|
136 | 131 | request = get_current_request() |
|
137 | 132 | get_db_session(None, self).flush() |
|
138 | 133 | for action in self.application.postprocess_conf: |
|
139 | 134 | get_db_session(None, self).flush() |
|
140 | 135 | rule_obj = Rule(action.rule, REPORT_TYPE_MATRIX) |
|
141 | 136 | report_dict = report.get_dict(request) |
|
142 | 137 | # if was not processed yet |
|
143 | 138 | if (rule_obj.match(report_dict) and |
|
144 | 139 | action.pkey not in self.triggered_postprocesses_ids): |
|
145 | 140 | action.postprocess(self) |
|
146 | 141 | # this way sqla can track mutation of list |
|
147 | 142 | self.triggered_postprocesses_ids = \ |
|
148 | 143 | self.triggered_postprocesses_ids + [action.pkey] |
|
149 | 144 | |
|
150 | 145 | get_db_session(None, self).flush() |
|
151 | 146 | # do not go out of bounds |
|
152 | 147 | if self.priority < 1: |
|
153 | 148 | self.priority = 1 |
|
154 | 149 | if self.priority > 10: |
|
155 | 150 | self.priority = 10 |
|
156 | 151 | |
|
157 | 152 | def get_dict(self, request): |
|
158 | 153 | instance_dict = super(ReportGroup, self).get_dict() |
|
159 | 154 | instance_dict['server_name'] = self.get_report().tags.get( |
|
160 | 155 | 'server_name') |
|
161 | 156 | instance_dict['view_name'] = self.get_report().tags.get('view_name') |
|
162 | 157 | instance_dict['resource_name'] = self.application.resource_name |
|
163 | 158 | instance_dict['report_type'] = self.get_report().report_type |
|
164 | 159 | instance_dict['url_path'] = self.get_report().url_path |
|
165 | 160 | instance_dict['front_url'] = self.get_report().get_public_url(request) |
|
166 | 161 | del instance_dict['triggered_postprocesses_ids'] |
|
167 | 162 | return instance_dict |
|
168 | 163 | |
|
169 | 164 | def es_doc(self): |
|
170 | 165 | return { |
|
171 | 166 | '_id': str(self.id), |
|
172 | 167 | 'pg_id': str(self.id), |
|
173 | 168 | 'resource_id': self.resource_id, |
|
174 | 169 | 'error': self.error, |
|
175 | 170 | 'fixed': self.fixed, |
|
176 | 171 | 'public': self.public, |
|
177 | 172 | 'read': self.read, |
|
178 | 173 | 'priority': self.priority, |
|
179 | 174 | 'occurences': self.occurences, |
|
180 | 175 | 'average_duration': self.average_duration, |
|
181 | 176 | 'summed_duration': self.summed_duration, |
|
182 | 177 | 'first_timestamp': self.first_timestamp, |
|
183 | 178 | 'last_timestamp': self.last_timestamp |
|
184 | 179 | } |
|
185 | 180 | |
|
186 | 181 | def set_notification_info(self, notify_10=False, notify_100=False): |
|
187 | 182 | """ |
|
188 | 183 | Update redis notification maps for notification job |
|
189 | 184 | """ |
|
190 | 185 | current_time = datetime.utcnow().replace(second=0, microsecond=0) |
|
191 | 186 | # global app counter |
|
192 | 187 | key = REDIS_KEYS['counters']['reports_per_type'].format( |
|
193 | 188 | self.report_type, current_time) |
|
194 | 189 | redis_pipeline = Datastores.redis.pipeline() |
|
195 | 190 | redis_pipeline.incr(key) |
|
196 | 191 | redis_pipeline.expire(key, 3600 * 24) |
|
197 | 192 | # detailed app notification for alerts and notifications |
|
198 | 193 | redis_pipeline.sadd( |
|
199 | 194 | REDIS_KEYS['apps_that_had_reports'], self.resource_id) |
|
200 | 195 | redis_pipeline.sadd( |
|
201 | 196 | REDIS_KEYS['apps_that_had_reports_alerting'], self.resource_id) |
|
202 | 197 | # only notify for exceptions here |
|
203 | 198 | if self.report_type == ReportType.error: |
|
204 | 199 | redis_pipeline.sadd( |
|
205 | 200 | REDIS_KEYS['apps_that_had_reports'], self.resource_id) |
|
206 | 201 | redis_pipeline.sadd( |
|
207 | 202 | REDIS_KEYS['apps_that_had_error_reports_alerting'], |
|
208 | 203 | self.resource_id) |
|
209 | 204 | key = REDIS_KEYS['counters']['report_group_occurences'].format(self.id) |
|
210 | 205 | redis_pipeline.incr(key) |
|
211 | 206 | redis_pipeline.expire(key, 3600 * 24) |
|
212 | 207 | key = REDIS_KEYS['counters']['report_group_occurences_alerting'].format( |
|
213 | 208 | self.id) |
|
214 | 209 | redis_pipeline.incr(key) |
|
215 | 210 | redis_pipeline.expire(key, 3600 * 24) |
|
216 | 211 | |
|
217 | 212 | if notify_10: |
|
218 | 213 | key = REDIS_KEYS['counters'][ |
|
219 | 214 | 'report_group_occurences_10th'].format(self.id) |
|
220 | 215 | redis_pipeline.setex(key, 3600 * 24, 1) |
|
221 | 216 | if notify_100: |
|
222 | 217 | key = REDIS_KEYS['counters'][ |
|
223 | 218 | 'report_group_occurences_100th'].format(self.id) |
|
224 | 219 | redis_pipeline.setex(key, 3600 * 24, 1) |
|
225 | 220 | |
|
226 | 221 | key = REDIS_KEYS['reports_to_notify_per_type_per_app'].format( |
|
227 | 222 | self.report_type, self.resource_id) |
|
228 | 223 | redis_pipeline.sadd(key, self.id) |
|
229 | 224 | redis_pipeline.expire(key, 3600 * 24) |
|
230 | 225 | key = REDIS_KEYS['reports_to_notify_per_type_per_app_alerting'].format( |
|
231 | 226 | self.report_type, self.resource_id) |
|
232 | 227 | redis_pipeline.sadd(key, self.id) |
|
233 | 228 | redis_pipeline.expire(key, 3600 * 24) |
|
234 | 229 | redis_pipeline.execute() |
|
235 | 230 | |
|
236 | 231 | @property |
|
237 | 232 | def partition_id(self): |
|
238 | 233 | return 'rcae_r_%s' % self.first_timestamp.strftime('%Y_%m') |
|
239 | 234 | |
|
240 | 235 | def partition_range(self): |
|
241 | 236 | start_date = self.first_timestamp.date().replace(day=1) |
|
242 | 237 | end_date = start_date + timedelta(days=40) |
|
243 | 238 | end_date = end_date.replace(day=1) |
|
244 | 239 | return start_date, end_date |
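The `partition_id` / `partition_range` pair above maps a report's first timestamp onto a monthly partition name and its `[first-of-month, first-of-next-month)` date window. A minimal stdlib-only sketch of the same logic (standalone functions here; in the diff they are methods on `ReportGroup`, and the 40-day jump is just a safe way to land in the next month before snapping back to day 1):

```python
from datetime import date, timedelta


def partition_id(first_timestamp: date) -> str:
    # 'rcae_r_%Y_%m' -- one partition per calendar month
    return 'rcae_r_%s' % first_timestamp.strftime('%Y_%m')


def partition_range(first_timestamp: date):
    # start: first day of the timestamp's month
    start_date = first_timestamp.replace(day=1)
    # +40 days always crosses into the next month (max month length is 31),
    # then snap back to day 1 to get an exclusive upper bound
    end_date = (start_date + timedelta(days=40)).replace(day=1)
    return start_date, end_date
```

The year rollover works for free: December + 40 days lands in January of the following year before the `replace(day=1)` snap.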
|
245 | 240 | |
|
246 | 241 | |
|
247 | 242 | def after_insert(mapper, connection, target): |
|
248 | 243 | if not hasattr(target, '_skip_ft_index'): |
|
249 | 244 | data = target.es_doc() |
|
250 | 245 | data.pop('_id', None) |
|
251 | 246 | Datastores.es.index(target.partition_id, 'report_group', |
|
252 | 247 | data, id=target.id) |
|
253 | 248 | |
|
254 | 249 | |
|
255 | 250 | def after_update(mapper, connection, target): |
|
256 | 251 | if not hasattr(target, '_skip_ft_index'): |
|
257 | 252 | data = target.es_doc() |
|
258 | 253 | data.pop('_id', None) |
|
259 | 254 | Datastores.es.index(target.partition_id, 'report_group', |
|
260 | 255 | data, id=target.id) |
|
261 | 256 | |
|
262 | 257 | |
|
263 | 258 | def after_delete(mapper, connection, target): |
|
264 | 259 | query = {'term': {'group_id': target.id}} |
|
265 | 260 | # TODO: routing seems unnecessary, need to test a bit more |
|
266 | 261 | # Datastores.es.delete_by_query(target.partition_id, 'report', query, |
|
267 | 262 | # query_params={'routing':str(target.id)}) |
|
268 | 263 | Datastores.es.delete_by_query(target.partition_id, 'report', query) |
|
269 | 264 | query = {'term': {'pg_id': target.id}} |
|
270 | 265 | Datastores.es.delete_by_query(target.partition_id, 'report_group', query) |
|
271 | 266 | |
|
272 | 267 | |
|
273 | 268 | sa.event.listen(ReportGroup, 'after_insert', after_insert) |
|
274 | 269 | sa.event.listen(ReportGroup, 'after_update', after_update) |
|
275 | 270 | sa.event.listen(ReportGroup, 'after_delete', after_delete) |
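The three `sa.event.listen` registrations above keep Elasticsearch in sync with the ORM: insert/update re-index the group document, delete purges both the group and its reports by term query. A stdlib-only sketch of that hook pattern follows; `FakeES` and the tiny `listen`/`emit` registry are stand-ins for SQLAlchemy's event system and the real `Datastores.es` client, not the project's actual API:

```python
class FakeES:
    """In-memory stand-in for the Elasticsearch client used in the diff."""

    def __init__(self):
        self.docs = {}

    def index(self, partition, doc_type, data, id=None):
        self.docs[(partition, id)] = data

    def delete_by_query(self, partition, doc_type, query):
        # emulate {'term': {field: value}} deletion
        field, value = next(iter(query['term'].items()))
        self.docs = {k: v for k, v in self.docs.items()
                     if v.get(field) != value}


_listeners = {}


def listen(model, event_name, fn):
    """Register fn for (model, event), like sa.event.listen."""
    _listeners.setdefault((model, event_name), []).append(fn)


def emit(model, event_name, target):
    """Fire listeners with the (mapper, connection, target) signature."""
    for fn in _listeners.get((model, event_name), []):
        fn(None, None, target)
```

Wiring `after_insert`/`after_delete` callbacks through `listen` and firing them with `emit` reproduces the index-on-write, purge-on-delete flow without a real database or search cluster.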
@@ -1,79 +1,74 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | |
|
24 | 19 | from appenlight.lib.enums import ReportType |
|
25 | 20 | from appenlight.models import Base |
|
26 | 21 | from ziggurat_foundations.models.base import BaseModel |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | class ReportStat(Base, BaseModel): |
|
30 | 25 | __tablename__ = 'reports_stats' |
|
31 | 26 | __table_args__ = {'implicit_returning': False} |
|
32 | 27 | |
|
33 | 28 | group_id = sa.Column(sa.BigInteger(), |
|
34 | 29 | sa.ForeignKey('reports_groups.id'), |
|
35 | 30 | nullable=False) |
|
36 | 31 | resource_id = sa.Column(sa.Integer(), |
|
37 | 32 | sa.ForeignKey('applications.resource_id'), |
|
38 | 33 | nullable=False) |
|
39 | 34 | start_interval = sa.Column(sa.DateTime(), nullable=False) |
|
40 | 35 | occurences = sa.Column(sa.Integer, nullable=True, default=0) |
|
41 | 36 | owner_user_id = sa.Column(sa.Integer(), sa.ForeignKey('users.id'), |
|
42 | 37 | nullable=True) |
|
43 | 38 | type = sa.Column(sa.Integer, nullable=True, default=0) |
|
44 | 39 | duration = sa.Column(sa.Float, nullable=True, default=0) |
|
45 | 40 | id = sa.Column(sa.BigInteger, nullable=False, primary_key=True) |
|
46 | 41 | server_name = sa.Column(sa.Unicode(128), nullable=False, default='') |
|
47 | 42 | view_name = sa.Column(sa.Unicode(128), nullable=False, default='') |
|
48 | 43 | |
|
49 | 44 | @property |
|
50 | 45 | def partition_id(self): |
|
51 | 46 | return 'rcae_r_%s' % self.start_interval.strftime('%Y_%m') |
|
52 | 47 | |
|
53 | 48 | def es_doc(self): |
|
54 | 49 | return { |
|
55 | 50 | 'resource_id': self.resource_id, |
|
56 | 51 | 'timestamp': self.start_interval, |
|
57 | 52 | 'pg_id': str(self.id), |
|
58 | 53 | 'permanent': True, |
|
59 | 54 | 'request_id': None, |
|
60 | 55 | 'log_level': 'ERROR', |
|
61 | 56 | 'message': None, |
|
62 | 57 | 'namespace': 'appenlight.error', |
|
63 | 58 | 'tags': { |
|
64 | 59 | 'duration': {'values': self.duration, |
|
65 | 60 | 'numeric_values': self.duration}, |
|
66 | 61 | 'occurences': {'values': self.occurences, |
|
67 | 62 | 'numeric_values': self.occurences}, |
|
68 | 63 | 'group_id': {'values': self.group_id, |
|
69 | 64 | 'numeric_values': self.group_id}, |
|
70 | 65 | 'type': {'values': ReportType.key_from_value(self.type), |
|
71 | 66 | 'numeric_values': self.type}, |
|
72 | 67 | 'server_name': {'values': self.server_name, |
|
73 | 68 | 'numeric_values': None}, |
|
74 | 69 | 'view_name': {'values': self.view_name, |
|
75 | 70 | 'numeric_values': None}, |
|
76 | 71 | }, |
|
77 | 72 | 'tag_list': ['duration', 'occurences', 'group_id', 'type', |
|
78 | 73 | 'server_name', 'view_name'] |
|
79 | 74 | } |
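Every tag in the `es_doc` hunk above is serialized as a `{'values': ..., 'numeric_values': ...}` pair, with a flat `tag_list` kept alongside for lookups. A small sketch of that shape (the helper name `make_tags` is hypothetical, not from the project):

```python
def make_tags(pairs):
    """Build the tags dict + tag_list shape used by es_doc.

    pairs: iterable of (name, value, numeric_value) tuples;
    numeric_value is None for non-numeric tags like server_name.
    """
    tags = {}
    for name, value, numeric in pairs:
        tags[name] = {'values': value, 'numeric_values': numeric}
    return tags, list(tags.keys())
```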
@@ -1,88 +1,83 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | import sqlalchemy as sa |
|
23 | 18 | from appenlight.models import Base |
|
24 | 19 | from appenlight.lib.utils import permission_tuple_to_dict |
|
25 | 20 | from pyramid.security import Allow, ALL_PERMISSIONS |
|
26 | 21 | from ziggurat_foundations.models.resource import ResourceMixin |
|
27 | 22 | |
|
28 | 23 | |
|
29 | 24 | class Resource(ResourceMixin, Base): |
|
30 | 25 | events = sa.orm.relationship('Event', |
|
31 | 26 | lazy='dynamic', |
|
32 | 27 | backref='resource', |
|
33 | 28 | passive_deletes=True, |
|
34 | 29 | passive_updates=True) |
|
35 | 30 | |
|
36 | 31 | @property |
|
37 | 32 | def owner_user_name(self): |
|
38 | 33 | if self.owner: |
|
39 | 34 | return self.owner.user_name |
|
40 | 35 | |
|
41 | 36 | @property |
|
42 | 37 | def owner_group_name(self): |
|
43 | 38 | if self.owner_group: |
|
44 | 39 | return self.owner_group.group_name |
|
45 | 40 | |
|
46 | 41 | def get_dict(self, exclude_keys=None, include_keys=None, |
|
47 | 42 | include_perms=False, include_processing_rules=False): |
|
48 | 43 | result = super(Resource, self).get_dict(exclude_keys, include_keys) |
|
49 | 44 | result['possible_permissions'] = self.__possible_permissions__ |
|
50 | 45 | if include_perms: |
|
51 | 46 | result['current_permissions'] = self.user_permissions_list |
|
52 | 47 | else: |
|
53 | 48 | result['current_permissions'] = [] |
|
54 | 49 | if include_processing_rules: |
|
55 | 50 | result["postprocessing_rules"] = [rule.get_dict() for rule |
|
56 | 51 | in self.postprocess_conf] |
|
57 | 52 | else: |
|
58 | 53 | result["postprocessing_rules"] = [] |
|
59 | 54 | exclude_keys_list = exclude_keys or [] |
|
60 | 55 | include_keys_list = include_keys or [] |
|
61 | 56 | d = {} |
|
62 | 57 | for k in result.keys(): |
|
63 | 58 | if (k not in exclude_keys_list and |
|
64 | 59 | (k in include_keys_list or not include_keys)): |
|
65 | 60 | d[k] = result[k] |
|
66 | 61 | for k in ['owner_user_name', 'owner_group_name']: |
|
67 | 62 | if (k not in exclude_keys_list and |
|
68 | 63 | (k in include_keys_list or not include_keys)): |
|
69 | 64 | d[k] = getattr(self, k) |
|
70 | 65 | return d |
|
71 | 66 | |
|
72 | 67 | @property |
|
73 | 68 | def user_permissions_list(self): |
|
74 | 69 | return [permission_tuple_to_dict(perm) for perm in |
|
75 | 70 | self.users_for_perm('__any_permission__', |
|
76 | 71 | limit_group_permissions=True)] |
|
77 | 72 | |
|
78 | 73 | @property |
|
79 | 74 | def __acl__(self): |
|
80 | 75 | acls = [] |
|
81 | 76 | |
|
82 | 77 | if self.owner_user_id: |
|
83 | 78 | acls.extend([(Allow, self.owner_user_id, ALL_PERMISSIONS,), ]) |
|
84 | 79 | |
|
85 | 80 | if self.owner_group_id: |
|
86 | 81 | acls.extend([(Allow, "group:%s" % self.owner_group_id, |
|
87 | 82 | ALL_PERMISSIONS,), ]) |
|
88 | 83 | return acls |
@@ -1,21 +1,16 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 |
@@ -1,40 +1,35 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from appenlight.models import get_db_session |
|
23 | 18 | from appenlight.models.alert_channel import AlertChannel |
|
24 | 19 | from appenlight.models.services.base import BaseService |
|
25 | 20 | |
|
26 | 21 | |
|
27 | 22 | class AlertChannelService(BaseService): |
|
28 | 23 | @classmethod |
|
29 | 24 | def by_owner_id_and_pkey(cls, owner_id, pkey, db_session=None): |
|
30 | 25 | db_session = get_db_session(db_session) |
|
31 | 26 | query = db_session.query(AlertChannel) |
|
32 | 27 | query = query.filter(AlertChannel.owner_id == owner_id) |
|
33 | 28 | return query.filter(AlertChannel.pkey == pkey).first() |
|
34 | 29 | |
|
35 | 30 | @classmethod |
|
36 | 31 | def by_integration_id(cls, integration_id, db_session=None): |
|
37 | 32 | db_session = get_db_session(db_session) |
|
38 | 33 | query = db_session.query(AlertChannel) |
|
39 | 34 | query = query.filter(AlertChannel.integration_id == integration_id) |
|
40 | 35 | return query.first() |
@@ -1,64 +1,59 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
|
3 | # Copyright 2010 - 2017 RhodeCode GmbH and the AppEnlight project authors | |
|
4 | 4 | # |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
5 | # Licensed under the Apache License, Version 2.0 (the "License"); | |
|
6 | # you may not use this file except in compliance with the License. | |
|
7 | # You may obtain a copy of the License at | |
|
8 | 8 | # |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
9 | # http://www.apache.org/licenses/LICENSE-2.0 | |
|
13 | 10 | # |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # AppEnlight Enterprise Edition, including its added features, Support | |
|
19 | # services, and proprietary license terms, please see | |
|
20 | # https://rhodecode.com/licenses/ | |
|
11 | # Unless required by applicable law or agreed to in writing, software | |
|
12 | # distributed under the License is distributed on an "AS IS" BASIS, | |
|
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. | |
|
14 | # See the License for the specific language governing permissions and | |
|
15 | # limitations under the License. | |
|
21 | 16 | |
|
22 | 17 | from appenlight.models import get_db_session |
|
23 | 18 | from appenlight.models.alert_channel_action import AlertChannelAction |
|
24 | 19 | from appenlight.models.services.base import BaseService |
|
25 | 20 | |
|
26 | 21 | |
|
27 | 22 | class AlertChannelActionService(BaseService): |
|
28 | 23 | @classmethod |
|
29 | 24 | def by_owner_id_and_pkey(cls, owner_id, pkey, db_session=None): |
|
30 | 25 | db_session = get_db_session(db_session) |
|
31 | 26 | query = db_session.query(AlertChannelAction) |
|
32 | 27 | query = query.filter(AlertChannelAction.owner_id == owner_id) |
|
33 | 28 | return query.filter(AlertChannelAction.pkey == pkey).first() |
|
34 | 29 | |
|
35 | 30 | @classmethod |
|
36 | 31 | def by_pkey(cls, pkey, db_session=None): |
|
37 | 32 | db_session = get_db_session(db_session) |
|
38 | 33 | query = db_session.query(AlertChannelAction) |
|
39 | 34 | return query.filter(AlertChannelAction.pkey == pkey).first() |
|
40 | 35 | |
|
41 | 36 | @classmethod |
|
42 | 37 | def by_owner_id_and_type(cls, owner_id, alert_type, db_session=None): |
|
43 | 38 | db_session = get_db_session(db_session) |
|
44 | 39 | query = db_session.query(AlertChannelAction) |
|
45 | 40 | query = query.filter(AlertChannelAction.owner_id == owner_id) |
|
46 | 41 | return query.filter(AlertChannelAction.type == alert_type).first() |
|
47 | 42 | |
|
48 | 43 | @classmethod |
|
49 | 44 | def by_type(cls, alert_type, db_session=None): |
|
50 | 45 | db_session = get_db_session(db_session) |
|
51 | 46 | query = db_session.query(AlertChannelAction) |
|
52 | 47 | return query.filter(AlertChannelAction.type == alert_type) |
|
53 | 48 | |
|
54 | 49 | @classmethod |
|
55 | 50 | def by_other_id(cls, other_id, db_session=None): |
|
56 | 51 | db_session = get_db_session(db_session) |
|
57 | 52 | query = db_session.query(AlertChannelAction) |
|
58 | 53 | return query.filter(AlertChannelAction.other_id == other_id) |
|
59 | 54 | |
|
60 | 55 | @classmethod |
|
61 | 56 | def by_resource_id(cls, resource_id, db_session=None): |
|
62 | 57 | db_session = get_db_session(db_session) |
|
63 | 58 | query = db_session.query(AlertChannelAction) |
|
64 | 59 | return query.filter(AlertChannelAction.resource_id == resource_id) |
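The service classes above all follow one pattern: a `BaseService` classmethod builds a query, applies one filter per line, and returns `.first()` (single lookup) or the query itself (iterable result). A stdlib-only sketch of that layering, with `FakeQuery` standing in for a SQLAlchemy `Query` and a plain list standing in for the session (not the real API):

```python
class FakeQuery:
    """Minimal stand-in for a SQLAlchemy Query over in-memory rows."""

    def __init__(self, rows):
        self._rows = list(rows)

    def filter(self, predicate):
        # each .filter() returns a new, narrower query
        return FakeQuery(r for r in self._rows if predicate(r))

    def first(self):
        return self._rows[0] if self._rows else None


class AlertChannelActionService:
    @classmethod
    def by_owner_id_and_pkey(cls, rows, owner_id, pkey):
        query = FakeQuery(rows)
        query = query.filter(lambda r: r['owner_id'] == owner_id)
        return query.filter(lambda r: r['pkey'] == pkey).first()
```

The incremental `query = query.filter(...)` style mirrors the diff: each narrowing step stays on its own line, so adding or dropping a criterion is a one-line change.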
|
1 | NO CONTENT: modified file
The requested commit or file is too big and content was truncated.
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |