Compare commits: e792b86485...v0.2.1 (29 commits)
| Author | SHA1 | Date |
|---|---|---|
|  | 899db3421b |  |
|  | e3509e997f |  |
|  | 1c30200db0 |  |
|  | 7ff422d4dc |  |
|  | 546d51af9a |  |
|  | 0d1fe05fd0 |  |
|  | c5d4b2f1cd |  |
|  | caf01d6ada |  |
|  | a5d19e2982 |  |
|  | 692e7e3a6e |  |
|  | 78dba93ee0 |  |
|  | 93a5aa6618 |  |
|  | 9ab750650c |  |
|  | 609e9db2f7 |  |
|  | 94a55cf2b7 |  |
|  | b9cfc45aa2 |  |
|  | 2d60e36fbf |  |
|  | c78f7fa6b0 |  |
|  | b3dce8d13e |  |
|  | bb366cb4cd |  |
|  | a2745ff2ee |  |
|  | 28cb656d94 |  |
|  | 3c44152fc6 |  |
|  | 397515edce |  |
|  | 980fced7e4 |  |
|  | bae5009ec4 |  |
|  | 233780617f |  |
|  | fd8fb21517 |  |
|  | c6cbe822e1 |  |
LICENSE (new file, 661 lines)
@@ -0,0 +1,661 @@
                    GNU AFFERO GENERAL PUBLIC LICENSE
                       Version 3, 19 November 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

  When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

  A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

  The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

  An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU Affero General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7. This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy. This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged. This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source. This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge. You need not require recipients to copy the
    Corresponding Source along with the object code. If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source. Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

  13. Remote Network Interaction; Use with the GNU General Public License.

  Notwithstanding any other provision of this License, if you modify the
Program, your modified version must prominently offer all users
interacting with it remotely through a computer network (if your version
supports such interaction) an opportunity to receive the Corresponding
Source of your version by providing access to the Corresponding Source
from a network server at no charge, through some standard or customary
means of facilitating copying of software. This Corresponding Source
shall include the Corresponding Source for any work covered by version 3
of the GNU General Public License that is incorporated pursuant to the
following paragraph.

  Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the work with which it is combined will remain governed by version
3 of the GNU General Public License.

  14. Revised Versions of this License.

  The Free Software Foundation may publish revised and/or new versions of
the GNU Affero General Public License from time to time. Such new versions
will be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

  Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU Affero General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU Affero General Public License, you may choose any version ever published
by the Free Software Foundation.

  If the Program specifies that a proxy can decide which future
versions of the GNU Affero General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

  Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.

  15. Disclaimer of Warranty.

  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

  16. Limitation of Liability.

  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.

  17. Interpretation of Sections 15 and 16.

  If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

                     END OF TERMS AND CONDITIONS

            How to Apply These Terms to Your New Programs

  If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

  To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) <year> <name of author>

    This program is free software: you can redistribute it and/or modify
    it under the terms of the GNU Affero General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
    GNU Affero General Public License for more details.

    You should have received a copy of the GNU Affero General Public License
    along with this program. If not, see <https://www.gnu.org/licenses/>.

Also add information on how to contact you by electronic and paper mail.

  If your software can interact with users remotely through a computer
network, you should also make sure that it provides a way for users to
get its source. For example, if your program is a web application, its
interface could display a "Source" link that leads users to an archive
of the code. There are many ways you could offer source, and different
solutions will be better for different programs; see section 13 for the
specific requirements.

  You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU AGPL, see
<https://www.gnu.org/licenses/>.
@@ -16,6 +16,7 @@ from flask_wtf.csrf import CSRFError
 from werkzeug.middleware.proxy_fix import ProxyFix
 
 from .access_logging import AccessLoggingService
+from .compression import GzipMiddleware
 from .acl import AclService
 from .bucket_policies import BucketPolicyStore
 from .config import AppConfig
@@ -89,13 +90,24 @@ def create_app(
     # Trust X-Forwarded-* headers from proxies
     app.wsgi_app = ProxyFix(app.wsgi_app, x_for=1, x_proto=1, x_host=1, x_prefix=1)
 
+    # Enable gzip compression for responses (10-20x smaller JSON payloads)
+    if app.config.get("ENABLE_GZIP", True):
+        app.wsgi_app = GzipMiddleware(app.wsgi_app, compression_level=6)
+
     _configure_cors(app)
     _configure_logging(app)
 
     limiter.init_app(app)
     csrf.init_app(app)
 
-    storage = ObjectStorage(Path(app.config["STORAGE_ROOT"]))
+    storage = ObjectStorage(
+        Path(app.config["STORAGE_ROOT"]),
+        cache_ttl=app.config.get("OBJECT_CACHE_TTL", 5),
+    )
+
+    if app.config.get("WARM_CACHE_ON_STARTUP", True) and not app.config.get("TESTING"):
+        storage.warm_cache_async()
+
     iam = IamService(
         Path(app.config["IAM_CONFIG"]),
         auth_max_attempts=app.config.get("AUTH_MAX_ATTEMPTS", 5),
@@ -124,7 +136,7 @@ def create_app(
     )
 
     connections = ConnectionStore(connections_path)
-    replication = ReplicationManager(storage, connections, replication_rules_path)
+    replication = ReplicationManager(storage, connections, replication_rules_path, storage_root)
 
     encryption_config = {
         "encryption_enabled": app.config.get("ENCRYPTION_ENABLED", False),
@@ -156,6 +168,7 @@ def create_app(
     lifecycle_manager = LifecycleManager(
         base_storage,
         interval_seconds=app.config.get("LIFECYCLE_INTERVAL_SECONDS", 3600),
+        storage_root=storage_root,
     )
     lifecycle_manager.start()
 
@@ -289,17 +302,17 @@ def _configure_logging(app: Flask) -> None:
     formatter = logging.Formatter(
         "%(asctime)s | %(levelname)s | %(request_id)s | %(method)s %(path)s | %(message)s"
     )
 
-    # Stream Handler (stdout) - Primary for Docker
     stream_handler = logging.StreamHandler(sys.stdout)
     stream_handler.setFormatter(formatter)
     stream_handler.addFilter(_RequestContextFilter())
 
     logger = app.logger
+    for handler in logger.handlers[:]:
+        handler.close()
     logger.handlers.clear()
     logger.addHandler(stream_handler)
 
-    # File Handler (optional, if configured)
     if app.config.get("LOG_TO_FILE"):
         log_file = Path(app.config["LOG_FILE"])
         log_file.parent.mkdir(parents=True, exist_ok=True)
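Note on ordering: each assignment above wraps the previous app.wsgi_app callable, so the wrapper applied last sits outermost, seeing the request first and the response bytes last. A minimal sketch of the resulting stack (the app.compression import path is assumed from the new file shown later in this diff; the helper is illustrative, not part of the change):

    # Illustrative: WSGI server -> GzipMiddleware -> ProxyFix -> Flask app
    from werkzeug.middleware.proxy_fix import ProxyFix
    from app.compression import GzipMiddleware

    def build_stack(flask_wsgi_app):
        # ProxyFix rewrites X-Forwarded-* headers before routing happens;
        # GzipMiddleware then compresses the final, fully rendered response.
        inner = ProxyFix(flask_wsgi_app, x_for=1, x_proto=1, x_host=1, x_prefix=1)
        return GzipMiddleware(inner, compression_level=6)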
@@ -196,18 +196,21 @@ class AccessLoggingService:
         )
 
         target_key = f"{config.target_bucket}:{config.target_prefix}"
+        should_flush = False
         with self._buffer_lock:
             if target_key not in self._buffer:
                 self._buffer[target_key] = []
             self._buffer[target_key].append(entry)
+            should_flush = len(self._buffer[target_key]) >= self.max_buffer_size
 
-        if len(self._buffer[target_key]) >= self.max_buffer_size:
+        if should_flush:
            self._flush_buffer(target_key)
 
     def _flush_loop(self) -> None:
         while not self._shutdown.is_set():
-            time.sleep(self.flush_interval)
-            self._flush_all()
+            self._shutdown.wait(timeout=self.flush_interval)
+            if not self._shutdown.is_set():
+                self._flush_all()
 
     def _flush_all(self) -> None:
         with self._buffer_lock:
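The _flush_loop change above trades time.sleep for Event.wait, which returns as soon as the shutdown event is set instead of always sleeping out the full interval. A self-contained sketch of the pattern (names mirror the diff; the timing harness is illustrative):

    import threading
    import time

    shutdown = threading.Event()

    def flush_loop(flush_interval: float = 60.0) -> None:
        while not shutdown.is_set():
            # Returns immediately once shutdown.set() is called elsewhere,
            # rather than blocking for the full interval like time.sleep().
            shutdown.wait(timeout=flush_interval)
            if not shutdown.is_set():
                pass  # flush buffers here

    worker = threading.Thread(target=flush_loop)
    worker.start()
    time.sleep(0.1)

    start = time.monotonic()
    shutdown.set()
    worker.join()
    assert time.monotonic() - start < 1.0  # stops promptly, not after 60 s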
app/compression.py (new file, 94 lines)
@@ -0,0 +1,94 @@
from __future__ import annotations

import gzip
import io
from typing import Callable, Iterable, List, Tuple

COMPRESSIBLE_MIMES = frozenset([
    'application/json',
    'application/javascript',
    'application/xml',
    'text/html',
    'text/css',
    'text/plain',
    'text/xml',
    'text/javascript',
    'application/x-ndjson',
])

MIN_SIZE_FOR_COMPRESSION = 500


class GzipMiddleware:
    def __init__(self, app: Callable, compression_level: int = 6, min_size: int = MIN_SIZE_FOR_COMPRESSION):
        self.app = app
        self.compression_level = compression_level
        self.min_size = min_size

    def __call__(self, environ: dict, start_response: Callable) -> Iterable[bytes]:
        accept_encoding = environ.get('HTTP_ACCEPT_ENCODING', '')
        if 'gzip' not in accept_encoding.lower():
            return self.app(environ, start_response)

        response_started = False
        status_code = None
        response_headers: List[Tuple[str, str]] = []
        content_type = None
        content_length = None
        should_compress = False
        exc_info_holder = [None]

        def custom_start_response(status: str, headers: List[Tuple[str, str]], exc_info=None):
            nonlocal response_started, status_code, response_headers, content_type, content_length, should_compress
            response_started = True
            status_code = int(status.split(' ', 1)[0])
            response_headers = list(headers)
            exc_info_holder[0] = exc_info

            for name, value in headers:
                name_lower = name.lower()
                if name_lower == 'content-type':
                    content_type = value.split(';')[0].strip().lower()
                elif name_lower == 'content-length':
                    content_length = int(value)
                elif name_lower == 'content-encoding':
                    should_compress = False
                    return start_response(status, headers, exc_info)

            if content_type and content_type in COMPRESSIBLE_MIMES:
                if content_length is None or content_length >= self.min_size:
                    should_compress = True

            return None

        response_body = b''.join(self.app(environ, custom_start_response))

        if not response_started:
            return [response_body]

        if should_compress and len(response_body) >= self.min_size:
            buf = io.BytesIO()
            with gzip.GzipFile(fileobj=buf, mode='wb', compresslevel=self.compression_level) as gz:
                gz.write(response_body)
            compressed = buf.getvalue()

            if len(compressed) < len(response_body):
                response_body = compressed
                new_headers = []
                for name, value in response_headers:
                    if name.lower() not in ('content-length', 'content-encoding'):
                        new_headers.append((name, value))
                new_headers.append(('Content-Encoding', 'gzip'))
                new_headers.append(('Content-Length', str(len(response_body))))
                new_headers.append(('Vary', 'Accept-Encoding'))
                response_headers = new_headers

        status_str = f"{status_code} " + {
            200: "OK", 201: "Created", 204: "No Content", 206: "Partial Content",
            301: "Moved Permanently", 302: "Found", 304: "Not Modified",
            400: "Bad Request", 401: "Unauthorized", 403: "Forbidden", 404: "Not Found",
            405: "Method Not Allowed", 409: "Conflict", 500: "Internal Server Error",
        }.get(status_code, "Unknown")

        start_response(status_str, response_headers, exc_info_holder[0])
        return [response_body]
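A quick way to see the new middleware's behavior end to end is to wrap a tiny WSGI app and drive it with Werkzeug's test client (only GzipMiddleware comes from this diff; the test app and scaffolding are illustrative):

    from werkzeug.test import Client
    from app.compression import GzipMiddleware

    def tiny_app(environ, start_response):
        body = b'{"hello": "world"}' * 100  # well above MIN_SIZE_FOR_COMPRESSION
        start_response("200 OK", [("Content-Type", "application/json"),
                                  ("Content-Length", str(len(body)))])
        return [body]

    client = Client(GzipMiddleware(tiny_app))

    # A client that advertises gzip gets a compressed JSON body...
    resp = client.get("/", headers={"Accept-Encoding": "gzip"})
    assert resp.headers["Content-Encoding"] == "gzip"

    # ...while one that does not is passed through untouched.
    resp = client.get("/")
    assert "Content-Encoding" not in resp.headers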
@@ -67,6 +67,7 @@ class AppConfig:
     stream_chunk_size: int
     multipart_min_part_size: int
     bucket_stats_cache_ttl: int
+    object_cache_ttl: int
     encryption_enabled: bool
     encryption_master_key_path: Path
     kms_enabled: bool
@@ -84,7 +85,7 @@ class AppConfig:
         return overrides.get(name, os.getenv(name, default))
 
     storage_root = Path(_get("STORAGE_ROOT", PROJECT_ROOT / "data")).resolve()
-    max_upload_size = int(_get("MAX_UPLOAD_SIZE", 1024 * 1024 * 1024))  # 1 GiB default
+    max_upload_size = int(_get("MAX_UPLOAD_SIZE", 1024 * 1024 * 1024))
     ui_page_size = int(_get("UI_PAGE_SIZE", 100))
     auth_max_attempts = int(_get("AUTH_MAX_ATTEMPTS", 5))
     auth_lockout_minutes = int(_get("AUTH_LOCKOUT_MINUTES", 15))
@@ -108,6 +109,10 @@ class AppConfig:
         try:
             secret_file.parent.mkdir(parents=True, exist_ok=True)
             secret_file.write_text(generated)
+            try:
+                os.chmod(secret_file, 0o600)
+            except OSError:
+                pass
             secret_key = generated
         except OSError:
             secret_key = generated
@@ -157,8 +162,9 @@ class AppConfig:
     cors_allow_headers = _csv(str(_get("CORS_ALLOW_HEADERS", "*")), ["*"])
     cors_expose_headers = _csv(str(_get("CORS_EXPOSE_HEADERS", "*")), ["*"])
     session_lifetime_days = int(_get("SESSION_LIFETIME_DAYS", 30))
     bucket_stats_cache_ttl = int(_get("BUCKET_STATS_CACHE_TTL", 60))
+    object_cache_ttl = int(_get("OBJECT_CACHE_TTL", 5))
 
     encryption_enabled = str(_get("ENCRYPTION_ENABLED", "0")).lower() in {"1", "true", "yes", "on"}
     encryption_keys_dir = storage_root / ".myfsio.sys" / "keys"
     encryption_master_key_path = Path(_get("ENCRYPTION_MASTER_KEY_PATH", encryption_keys_dir / "master.key")).resolve()
@@ -196,6 +202,7 @@ class AppConfig:
         stream_chunk_size=stream_chunk_size,
         multipart_min_part_size=multipart_min_part_size,
         bucket_stats_cache_ttl=bucket_stats_cache_ttl,
+        object_cache_ttl=object_cache_ttl,
         encryption_enabled=encryption_enabled,
         encryption_master_key_path=encryption_master_key_path,
         kms_enabled=kms_enabled,
@@ -311,6 +318,7 @@ class AppConfig:
         "STREAM_CHUNK_SIZE": self.stream_chunk_size,
         "MULTIPART_MIN_PART_SIZE": self.multipart_min_part_size,
         "BUCKET_STATS_CACHE_TTL": self.bucket_stats_cache_ttl,
+        "OBJECT_CACHE_TTL": self.object_cache_ttl,
         "LOG_LEVEL": self.log_level,
         "LOG_TO_FILE": self.log_to_file,
         "LOG_FILE": str(self.log_path),
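The new OBJECT_CACHE_TTL knob resolves exactly like the existing settings via the _get helper shown above: explicit overrides win, then the environment, then the default of 5 seconds. A minimal sketch of that precedence (a standalone restatement of _get for illustration):

    import os

    def _get(name, default, overrides=None):
        # Mirrors AppConfig._get: overrides -> environment -> default.
        return (overrides or {}).get(name, os.getenv(name, default))

    os.environ["OBJECT_CACHE_TTL"] = "30"
    assert int(_get("OBJECT_CACHE_TTL", 5)) == 30
    assert int(_get("OBJECT_CACHE_TTL", 5, {"OBJECT_CACHE_TTL": 10})) == 10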
app/iam.py (21 changes)
@@ -26,14 +26,12 @@ IAM_ACTIONS = {
 ALLOWED_ACTIONS = (S3_ACTIONS | IAM_ACTIONS) | {"iam:*"}
 
 ACTION_ALIASES = {
-    # List actions
     "list": "list",
     "s3:listbucket": "list",
     "s3:listallmybuckets": "list",
     "s3:listbucketversions": "list",
     "s3:listmultipartuploads": "list",
     "s3:listparts": "list",
-    # Read actions
     "read": "read",
     "s3:getobject": "read",
     "s3:getobjectversion": "read",
@@ -43,7 +41,6 @@ ACTION_ALIASES = {
     "s3:getbucketversioning": "read",
     "s3:headobject": "read",
     "s3:headbucket": "read",
-    # Write actions
     "write": "write",
     "s3:putobject": "write",
     "s3:createbucket": "write",
@@ -54,23 +51,19 @@ ACTION_ALIASES = {
     "s3:completemultipartupload": "write",
     "s3:abortmultipartupload": "write",
     "s3:copyobject": "write",
-    # Delete actions
     "delete": "delete",
     "s3:deleteobject": "delete",
     "s3:deleteobjectversion": "delete",
     "s3:deletebucket": "delete",
     "s3:deleteobjecttagging": "delete",
-    # Share actions (ACL)
     "share": "share",
     "s3:putobjectacl": "share",
     "s3:putbucketacl": "share",
     "s3:getbucketacl": "share",
-    # Policy actions
     "policy": "policy",
     "s3:putbucketpolicy": "policy",
     "s3:getbucketpolicy": "policy",
     "s3:deletebucketpolicy": "policy",
-    # Replication actions
     "replication": "replication",
     "s3:getreplicationconfiguration": "replication",
     "s3:putreplicationconfiguration": "replication",
@@ -78,7 +71,6 @@ ACTION_ALIASES = {
     "s3:replicateobject": "replication",
     "s3:replicatetags": "replication",
     "s3:replicatedelete": "replication",
-    # IAM actions
     "iam:listusers": "iam:list_users",
     "iam:createuser": "iam:create_user",
     "iam:deleteuser": "iam:delete_user",
@@ -115,17 +107,15 @@ class IamService:
         self._raw_config: Dict[str, Any] = {}
         self._failed_attempts: Dict[str, Deque[datetime]] = {}
         self._last_load_time = 0.0
-        # Performance: credential cache with TTL
         self._credential_cache: Dict[str, Tuple[str, Principal, float]] = {}
-        self._cache_ttl = 60.0  # Cache credentials for 60 seconds
+        self._cache_ttl = 60.0
         self._last_stat_check = 0.0
-        self._stat_check_interval = 1.0  # Only stat() file every 1 second
+        self._stat_check_interval = 1.0
         self._sessions: Dict[str, Dict[str, Any]] = {}
         self._load()
 
     def _maybe_reload(self) -> None:
         """Reload configuration if the file has changed on disk."""
-        # Performance: Skip stat check if we checked recently
         now = time.time()
         if now - self._last_stat_check < self._stat_check_interval:
             return
@@ -133,7 +123,7 @@ class IamService:
         try:
             if self.config_path.stat().st_mtime > self._last_load_time:
                 self._load()
-                self._credential_cache.clear()  # Invalidate cache on reload
+                self._credential_cache.clear()
         except OSError:
             pass
 
@@ -227,7 +217,6 @@ class IamService:
             del self._sessions[token]
 
     def principal_for_key(self, access_key: str) -> Principal:
-        # Performance: Check cache first
         now = time.time()
         cached = self._credential_cache.get(access_key)
         if cached:
@@ -244,7 +233,6 @@ class IamService:
         return principal
 
     def secret_for_key(self, access_key: str) -> str:
-        # Performance: Check cache first
         now = time.time()
         cached = self._credential_cache.get(access_key)
         if cached:
@@ -508,7 +496,6 @@ class IamService:
         raise IamError("User not found")
 
     def get_secret_key(self, access_key: str) -> str | None:
-        # Performance: Check cache first
         now = time.time()
         cached = self._credential_cache.get(access_key)
         if cached:
@@ -519,14 +506,12 @@ class IamService:
         self._maybe_reload()
         record = self._users.get(access_key)
         if record:
-            # Cache the result
             principal = self._build_principal(access_key, record)
             self._credential_cache[access_key] = (record["secret_key"], principal, now)
             return record["secret_key"]
         return None
 
     def get_principal(self, access_key: str) -> Principal | None:
-        # Performance: Check cache first
         now = time.time()
         cached = self._credential_cache.get(access_key)
         if cached:
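
The hunks above strip the inline performance comments but leave the TTL credential cache intact. A minimal sketch of that pattern, with illustrative names rather than the project's API:

import time
from typing import Dict, Tuple

CACHE_TTL = 60.0  # seconds, matching _cache_ttl above
_cache: Dict[str, Tuple[str, float]] = {}

def _load_secret_from_disk(access_key: str) -> str:
    # Hypothetical slow path; the real service reads its JSON config file.
    return "secret-for-" + access_key

def secret_for_key(access_key: str) -> str:
    now = time.time()
    hit = _cache.get(access_key)
    if hit and now - hit[1] < CACHE_TTL:
        return hit[0]  # fresh cache entry, skip disk
    secret = _load_secret_from_disk(access_key)
    _cache[access_key] = (secret, now)
    return secret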

app/lifecycle.py (104 changed lines)

@@ -1,5 +1,6 @@
 from __future__ import annotations
 
+import json
 import logging
 import threading
 import time
@@ -23,13 +24,104 @@ class LifecycleResult:
     execution_time_seconds: float = 0.0
 
 
+@dataclass
+class LifecycleExecutionRecord:
+    timestamp: float
+    bucket_name: str
+    objects_deleted: int
+    versions_deleted: int
+    uploads_aborted: int
+    errors: List[str]
+    execution_time_seconds: float
+
+    def to_dict(self) -> dict:
+        return {
+            "timestamp": self.timestamp,
+            "bucket_name": self.bucket_name,
+            "objects_deleted": self.objects_deleted,
+            "versions_deleted": self.versions_deleted,
+            "uploads_aborted": self.uploads_aborted,
+            "errors": self.errors,
+            "execution_time_seconds": self.execution_time_seconds,
+        }
+
+    @classmethod
+    def from_dict(cls, data: dict) -> "LifecycleExecutionRecord":
+        return cls(
+            timestamp=data["timestamp"],
+            bucket_name=data["bucket_name"],
+            objects_deleted=data["objects_deleted"],
+            versions_deleted=data["versions_deleted"],
+            uploads_aborted=data["uploads_aborted"],
+            errors=data.get("errors", []),
+            execution_time_seconds=data["execution_time_seconds"],
+        )
+
+    @classmethod
+    def from_result(cls, result: LifecycleResult) -> "LifecycleExecutionRecord":
+        return cls(
+            timestamp=time.time(),
+            bucket_name=result.bucket_name,
+            objects_deleted=result.objects_deleted,
+            versions_deleted=result.versions_deleted,
+            uploads_aborted=result.uploads_aborted,
+            errors=result.errors.copy(),
+            execution_time_seconds=result.execution_time_seconds,
+        )
+
+
+class LifecycleHistoryStore:
+    MAX_HISTORY_PER_BUCKET = 50
+
+    def __init__(self, storage_root: Path) -> None:
+        self.storage_root = storage_root
+        self._lock = threading.Lock()
+
+    def _get_history_path(self, bucket_name: str) -> Path:
+        return self.storage_root / ".myfsio.sys" / "buckets" / bucket_name / "lifecycle_history.json"
+
+    def load_history(self, bucket_name: str) -> List[LifecycleExecutionRecord]:
+        path = self._get_history_path(bucket_name)
+        if not path.exists():
+            return []
+        try:
+            with open(path, "r") as f:
+                data = json.load(f)
+            return [LifecycleExecutionRecord.from_dict(d) for d in data.get("executions", [])]
+        except (OSError, ValueError, KeyError) as e:
+            logger.error(f"Failed to load lifecycle history for {bucket_name}: {e}")
+            return []
+
+    def save_history(self, bucket_name: str, records: List[LifecycleExecutionRecord]) -> None:
+        path = self._get_history_path(bucket_name)
+        path.parent.mkdir(parents=True, exist_ok=True)
+        data = {"executions": [r.to_dict() for r in records[:self.MAX_HISTORY_PER_BUCKET]]}
+        try:
+            with open(path, "w") as f:
+                json.dump(data, f, indent=2)
+        except OSError as e:
+            logger.error(f"Failed to save lifecycle history for {bucket_name}: {e}")
+
+    def add_record(self, bucket_name: str, record: LifecycleExecutionRecord) -> None:
+        with self._lock:
+            records = self.load_history(bucket_name)
+            records.insert(0, record)
+            self.save_history(bucket_name, records)
+
+    def get_history(self, bucket_name: str, limit: int = 50, offset: int = 0) -> List[LifecycleExecutionRecord]:
+        records = self.load_history(bucket_name)
+        return records[offset:offset + limit]
+
+
 class LifecycleManager:
-    def __init__(self, storage: ObjectStorage, interval_seconds: int = 3600):
+    def __init__(self, storage: ObjectStorage, interval_seconds: int = 3600, storage_root: Optional[Path] = None):
         self.storage = storage
         self.interval_seconds = interval_seconds
+        self.storage_root = storage_root
         self._timer: Optional[threading.Timer] = None
         self._shutdown = False
         self._lock = threading.Lock()
+        self.history_store = LifecycleHistoryStore(storage_root) if storage_root else None
 
     def start(self) -> None:
         if self._timer is not None:
@@ -98,12 +190,15 @@ class LifecycleManager:
             logger.error(f"Lifecycle enforcement error for {bucket_name}: {e}")
 
         result.execution_time_seconds = time.time() - start_time
-        if result.objects_deleted > 0 or result.versions_deleted > 0 or result.uploads_aborted > 0:
+        if result.objects_deleted > 0 or result.versions_deleted > 0 or result.uploads_aborted > 0 or result.errors:
             logger.info(
                 f"Lifecycle enforcement for {bucket_name}: "
                 f"deleted={result.objects_deleted}, versions={result.versions_deleted}, "
                 f"aborted={result.uploads_aborted}, time={result.execution_time_seconds:.2f}s"
            )
+        if self.history_store:
+            record = LifecycleExecutionRecord.from_result(result)
+            self.history_store.add_record(bucket_name, record)
         return result
 
     def _enforce_expiration(
@@ -233,3 +328,8 @@ class LifecycleManager:
         if bucket_name:
             return {bucket_name: self.enforce_rules(bucket_name)}
         return self.enforce_all_buckets()
+
+    def get_execution_history(self, bucket_name: str, limit: int = 50, offset: int = 0) -> List[LifecycleExecutionRecord]:
+        if not self.history_store:
+            return []
+        return self.history_store.get_history(bucket_name, limit, offset)
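
Usage sketch for the execution-history classes above, assuming they are importable as app.lifecycle and that ./data is the storage root (both assumptions):

import time
from pathlib import Path

from app.lifecycle import LifecycleExecutionRecord, LifecycleHistoryStore

store = LifecycleHistoryStore(Path("./data"))
store.add_record("photos", LifecycleExecutionRecord(
    timestamp=time.time(),
    bucket_name="photos",          # illustrative bucket name
    objects_deleted=3,
    versions_deleted=0,
    uploads_aborted=1,
    errors=[],
    execution_time_seconds=0.42,
))
# Newest record first; persisted under .myfsio.sys/buckets/photos/lifecycle_history.json
for record in store.get_history("photos", limit=10):
    print(record.to_dict())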
app/replication.py

@@ -8,7 +8,7 @@ import time
 from concurrent.futures import ThreadPoolExecutor
 from dataclasses import dataclass, field
 from pathlib import Path
-from typing import Any, Dict, Optional
+from typing import Any, Dict, List, Optional
 
 import boto3
 from botocore.config import Config
@@ -23,7 +23,7 @@ logger = logging.getLogger(__name__)
 REPLICATION_USER_AGENT = "S3ReplicationAgent/1.0"
 REPLICATION_CONNECT_TIMEOUT = 5
 REPLICATION_READ_TIMEOUT = 30
-STREAMING_THRESHOLD_BYTES = 10 * 1024 * 1024  # 10 MiB - use streaming for larger files
+STREAMING_THRESHOLD_BYTES = 10 * 1024 * 1024
 
 REPLICATION_MODE_NEW_ONLY = "new_only"
 REPLICATION_MODE_ALL = "all"
@@ -87,6 +87,40 @@ class ReplicationStats:
         )
 
 
+@dataclass
+class ReplicationFailure:
+    object_key: str
+    error_message: str
+    timestamp: float
+    failure_count: int
+    bucket_name: str
+    action: str
+    last_error_code: Optional[str] = None
+
+    def to_dict(self) -> dict:
+        return {
+            "object_key": self.object_key,
+            "error_message": self.error_message,
+            "timestamp": self.timestamp,
+            "failure_count": self.failure_count,
+            "bucket_name": self.bucket_name,
+            "action": self.action,
+            "last_error_code": self.last_error_code,
+        }
+
+    @classmethod
+    def from_dict(cls, data: dict) -> "ReplicationFailure":
+        return cls(
+            object_key=data["object_key"],
+            error_message=data["error_message"],
+            timestamp=data["timestamp"],
+            failure_count=data["failure_count"],
+            bucket_name=data["bucket_name"],
+            action=data["action"],
+            last_error_code=data.get("last_error_code"),
+        )
+
+
 @dataclass
 class ReplicationRule:
     bucket_name: str
@@ -120,15 +154,86 @@ class ReplicationRule:
         return rule
 
 
+class ReplicationFailureStore:
+    MAX_FAILURES_PER_BUCKET = 50
+
+    def __init__(self, storage_root: Path) -> None:
+        self.storage_root = storage_root
+        self._lock = threading.Lock()
+
+    def _get_failures_path(self, bucket_name: str) -> Path:
+        return self.storage_root / ".myfsio.sys" / "buckets" / bucket_name / "replication_failures.json"
+
+    def load_failures(self, bucket_name: str) -> List[ReplicationFailure]:
+        path = self._get_failures_path(bucket_name)
+        if not path.exists():
+            return []
+        try:
+            with open(path, "r") as f:
+                data = json.load(f)
+            return [ReplicationFailure.from_dict(d) for d in data.get("failures", [])]
+        except (OSError, ValueError, KeyError) as e:
+            logger.error(f"Failed to load replication failures for {bucket_name}: {e}")
+            return []
+
+    def save_failures(self, bucket_name: str, failures: List[ReplicationFailure]) -> None:
+        path = self._get_failures_path(bucket_name)
+        path.parent.mkdir(parents=True, exist_ok=True)
+        data = {"failures": [f.to_dict() for f in failures[:self.MAX_FAILURES_PER_BUCKET]]}
+        try:
+            with open(path, "w") as f:
+                json.dump(data, f, indent=2)
+        except OSError as e:
+            logger.error(f"Failed to save replication failures for {bucket_name}: {e}")
+
+    def add_failure(self, bucket_name: str, failure: ReplicationFailure) -> None:
+        with self._lock:
+            failures = self.load_failures(bucket_name)
+            existing = next((f for f in failures if f.object_key == failure.object_key), None)
+            if existing:
+                existing.failure_count += 1
+                existing.timestamp = failure.timestamp
+                existing.error_message = failure.error_message
+                existing.last_error_code = failure.last_error_code
+            else:
+                failures.insert(0, failure)
+            self.save_failures(bucket_name, failures)
+
+    def remove_failure(self, bucket_name: str, object_key: str) -> bool:
+        with self._lock:
+            failures = self.load_failures(bucket_name)
+            original_len = len(failures)
+            failures = [f for f in failures if f.object_key != object_key]
+            if len(failures) < original_len:
+                self.save_failures(bucket_name, failures)
+                return True
+            return False
+
+    def clear_failures(self, bucket_name: str) -> None:
+        with self._lock:
+            path = self._get_failures_path(bucket_name)
+            if path.exists():
+                path.unlink()
+
+    def get_failure(self, bucket_name: str, object_key: str) -> Optional[ReplicationFailure]:
+        failures = self.load_failures(bucket_name)
+        return next((f for f in failures if f.object_key == object_key), None)
+
+    def get_failure_count(self, bucket_name: str) -> int:
+        return len(self.load_failures(bucket_name))
+
+
 class ReplicationManager:
-    def __init__(self, storage: ObjectStorage, connections: ConnectionStore, rules_path: Path) -> None:
+    def __init__(self, storage: ObjectStorage, connections: ConnectionStore, rules_path: Path, storage_root: Path) -> None:
         self.storage = storage
         self.connections = connections
         self.rules_path = rules_path
+        self.storage_root = storage_root
         self._rules: Dict[str, ReplicationRule] = {}
         self._stats_lock = threading.Lock()
         self._executor = ThreadPoolExecutor(max_workers=4, thread_name_prefix="ReplicationWorker")
         self._shutdown = False
+        self.failure_store = ReplicationFailureStore(storage_root)
         self.reload_rules()
 
     def shutdown(self, wait: bool = True) -> None:
@@ -307,7 +412,6 @@ class ReplicationManager:
         if self._shutdown:
             return
 
-        # Re-check if rule is still enabled (may have been paused after task was submitted)
         current_rule = self.get_rule(bucket_name)
         if not current_rule or not current_rule.enabled:
             logger.debug(f"Replication skipped for {bucket_name}/{object_key}: rule disabled or removed")
@@ -332,8 +436,19 @@ class ReplicationManager:
                 s3.delete_object(Bucket=rule.target_bucket, Key=object_key)
                 logger.info(f"Replicated DELETE {bucket_name}/{object_key} to {conn.name} ({rule.target_bucket})")
                 self._update_last_sync(bucket_name, object_key)
+                self.failure_store.remove_failure(bucket_name, object_key)
             except ClientError as e:
+                error_code = e.response.get('Error', {}).get('Code')
                 logger.error(f"Replication DELETE failed for {bucket_name}/{object_key}: {e}")
+                self.failure_store.add_failure(bucket_name, ReplicationFailure(
+                    object_key=object_key,
+                    error_message=str(e),
+                    timestamp=time.time(),
+                    failure_count=1,
+                    bucket_name=bucket_name,
+                    action="delete",
+                    last_error_code=error_code,
+                ))
             return
 
         try:
@@ -358,7 +473,6 @@ class ReplicationManager:
                 extra_args["ContentType"] = content_type
 
             if file_size >= STREAMING_THRESHOLD_BYTES:
-                # Use multipart upload for large files
                 s3.upload_file(
                     str(path),
                     rule.target_bucket,
@@ -366,7 +480,6 @@ class ReplicationManager:
                     ExtraArgs=extra_args if extra_args else None,
                 )
             else:
-                # Read small files into memory
                 file_content = path.read_bytes()
                 put_kwargs = {
                     "Bucket": rule.target_bucket,
@@ -408,9 +521,89 @@ class ReplicationManager:
 
             logger.info(f"Replicated {bucket_name}/{object_key} to {conn.name} ({rule.target_bucket})")
             self._update_last_sync(bucket_name, object_key)
+            self.failure_store.remove_failure(bucket_name, object_key)
+
         except (ClientError, OSError, ValueError) as e:
+            error_code = None
+            if isinstance(e, ClientError):
+                error_code = e.response.get('Error', {}).get('Code')
             logger.error(f"Replication failed for {bucket_name}/{object_key}: {e}")
-        except Exception:
+            self.failure_store.add_failure(bucket_name, ReplicationFailure(
+                object_key=object_key,
+                error_message=str(e),
+                timestamp=time.time(),
+                failure_count=1,
+                bucket_name=bucket_name,
+                action=action,
+                last_error_code=error_code,
+            ))
+        except Exception as e:
             logger.exception(f"Unexpected error during replication for {bucket_name}/{object_key}")
+            self.failure_store.add_failure(bucket_name, ReplicationFailure(
+                object_key=object_key,
+                error_message=str(e),
+                timestamp=time.time(),
+                failure_count=1,
+                bucket_name=bucket_name,
+                action=action,
+                last_error_code=None,
+            ))
+
+    def get_failed_items(self, bucket_name: str, limit: int = 50, offset: int = 0) -> List[ReplicationFailure]:
+        failures = self.failure_store.load_failures(bucket_name)
+        return failures[offset:offset + limit]
+
+    def get_failure_count(self, bucket_name: str) -> int:
+        return self.failure_store.get_failure_count(bucket_name)
+
+    def retry_failed_item(self, bucket_name: str, object_key: str) -> bool:
+        failure = self.failure_store.get_failure(bucket_name, object_key)
+        if not failure:
+            return False
+
+        rule = self.get_rule(bucket_name)
+        if not rule or not rule.enabled:
+            return False
+
+        connection = self.connections.get(rule.target_connection_id)
+        if not connection:
+            logger.warning(f"Cannot retry: Connection {rule.target_connection_id} not found")
+            return False
+
+        if not self.check_endpoint_health(connection):
+            logger.warning(f"Cannot retry: Endpoint {connection.name} is not reachable")
+            return False
+
+        self._executor.submit(self._replicate_task, bucket_name, object_key, rule, connection, failure.action)
+        return True
+
+    def retry_all_failed(self, bucket_name: str) -> Dict[str, int]:
+        failures = self.failure_store.load_failures(bucket_name)
+        if not failures:
+            return {"submitted": 0, "skipped": 0}
+
+        rule = self.get_rule(bucket_name)
+        if not rule or not rule.enabled:
+            return {"submitted": 0, "skipped": len(failures)}
+
+        connection = self.connections.get(rule.target_connection_id)
+        if not connection:
+            logger.warning(f"Cannot retry: Connection {rule.target_connection_id} not found")
+            return {"submitted": 0, "skipped": len(failures)}
+
+        if not self.check_endpoint_health(connection):
+            logger.warning(f"Cannot retry: Endpoint {connection.name} is not reachable")
+            return {"submitted": 0, "skipped": len(failures)}
+
+        submitted = 0
+        for failure in failures:
+            self._executor.submit(self._replicate_task, bucket_name, failure.object_key, rule, connection, failure.action)
+            submitted += 1
+
+        return {"submitted": submitted, "skipped": 0}
+
+    def dismiss_failure(self, bucket_name: str, object_key: str) -> bool:
+        return self.failure_store.remove_failure(bucket_name, object_key)
+
+    def clear_failures(self, bucket_name: str) -> None:
+        self.failure_store.clear_failures(bucket_name)
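
Round-trip sketch for the failure store above (module path and the bucket/key names are assumptions):

import time
from pathlib import Path

from app.replication import ReplicationFailure, ReplicationFailureStore

store = ReplicationFailureStore(Path("./data"))
store.add_failure("photos", ReplicationFailure(
    object_key="cat.png",
    error_message="connect timeout",
    timestamp=time.time(),
    failure_count=1,
    bucket_name="photos",
    action="write",
))
# A second failure for the same key bumps failure_count instead of duplicating.
print(store.get_failure_count("photos"))   # 1
store.remove_failure("photos", "cat.png")  # what a successful retry does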

app/s3_api.py (190 changed lines)

@@ -25,7 +25,7 @@ from .iam import IamError, Principal
 from .notifications import NotificationService, NotificationConfiguration, WebhookDestination
 from .object_lock import ObjectLockService, ObjectLockRetention, ObjectLockConfig, ObjectLockError, RetentionMode
 from .replication import ReplicationManager
-from .storage import ObjectStorage, StorageError, QuotaExceededError
+from .storage import ObjectStorage, StorageError, QuotaExceededError, BucketNotFoundError, ObjectNotFoundError
 
 logger = logging.getLogger(__name__)
 
@@ -217,7 +217,6 @@ def _verify_sigv4_header(req: Any, auth_header: str) -> Principal | None:
     calculated_signature = hmac.new(signing_key, string_to_sign.encode("utf-8"), hashlib.sha256).hexdigest()
 
     if not hmac.compare_digest(calculated_signature, signature):
-        # Only log detailed signature debug info if DEBUG_SIGV4 is enabled
         if current_app.config.get("DEBUG_SIGV4"):
             logger.warning(
                 "SigV4 signature mismatch",
@@ -260,7 +259,13 @@ def _verify_sigv4_query(req: Any) -> Principal | None:
         raise IamError("Invalid Date format")
 
     now = datetime.now(timezone.utc)
-    if now > req_time + timedelta(seconds=int(expires)):
+    try:
+        expires_seconds = int(expires)
+        if expires_seconds <= 0:
+            raise IamError("Invalid Expires value: must be positive")
+    except ValueError:
+        raise IamError("Invalid Expires value: must be an integer")
+    if now > req_time + timedelta(seconds=expires_seconds):
         raise IamError("Request expired")
 
     secret_key = _iam().get_secret_key(access_key)
@@ -916,6 +921,7 @@ def _maybe_handle_bucket_subresource(bucket_name: str) -> Response | None:
         "object-lock": _bucket_object_lock_handler,
         "notification": _bucket_notification_handler,
         "logging": _bucket_logging_handler,
+        "uploads": _bucket_uploads_handler,
     }
     requested = [key for key in handlers if key in request.args]
     if not requested:
@@ -1036,21 +1042,23 @@ def _object_tagging_handler(bucket_name: str, object_key: str) -> Response:
     if request.method == "GET":
         try:
             tags = storage.get_object_tags(bucket_name, object_key)
+        except BucketNotFoundError as exc:
+            return _error_response("NoSuchBucket", str(exc), 404)
+        except ObjectNotFoundError as exc:
+            return _error_response("NoSuchKey", str(exc), 404)
         except StorageError as exc:
-            message = str(exc)
-            if "Bucket" in message:
-                return _error_response("NoSuchBucket", message, 404)
-            return _error_response("NoSuchKey", message, 404)
+            return _error_response("InternalError", str(exc), 500)
         return _xml_response(_render_tagging_document(tags))
 
     if request.method == "DELETE":
         try:
             storage.delete_object_tags(bucket_name, object_key)
+        except BucketNotFoundError as exc:
+            return _error_response("NoSuchBucket", str(exc), 404)
+        except ObjectNotFoundError as exc:
+            return _error_response("NoSuchKey", str(exc), 404)
         except StorageError as exc:
-            message = str(exc)
-            if "Bucket" in message:
-                return _error_response("NoSuchBucket", message, 404)
-            return _error_response("NoSuchKey", message, 404)
+            return _error_response("InternalError", str(exc), 500)
         current_app.logger.info("Object tags deleted", extra={"bucket": bucket_name, "key": object_key})
         return Response(status=204)
 
@@ -1063,11 +1071,12 @@ def _object_tagging_handler(bucket_name: str, object_key: str) -> Response:
         return _error_response("InvalidTag", "A maximum of 10 tags is supported for objects", 400)
     try:
         storage.set_object_tags(bucket_name, object_key, tags)
+    except BucketNotFoundError as exc:
+        return _error_response("NoSuchBucket", str(exc), 404)
+    except ObjectNotFoundError as exc:
+        return _error_response("NoSuchKey", str(exc), 404)
     except StorageError as exc:
-        message = str(exc)
-        if "Bucket" in message:
-            return _error_response("NoSuchBucket", message, 404)
-        return _error_response("NoSuchKey", message, 404)
+        return _error_response("InternalError", str(exc), 500)
     current_app.logger.info("Object tags updated", extra={"bucket": bucket_name, "key": object_key, "tags": len(tags)})
     return Response(status=204)
 
@@ -1283,7 +1292,10 @@ def _bucket_list_versions_handler(bucket_name: str) -> Response:
 
     prefix = request.args.get("prefix", "")
     delimiter = request.args.get("delimiter", "")
-    max_keys = min(int(request.args.get("max-keys", 1000)), 1000)
+    try:
+        max_keys = max(1, min(int(request.args.get("max-keys", 1000)), 1000))
+    except ValueError:
+        return _error_response("InvalidArgument", "max-keys must be an integer", 400)
     key_marker = request.args.get("key-marker", "")
 
     if prefix:
@@ -1314,7 +1326,8 @@ def _bucket_list_versions_handler(bucket_name: str) -> Response:
         SubElement(version, "VersionId").text = "null"
         SubElement(version, "IsLatest").text = "true"
         SubElement(version, "LastModified").text = obj.last_modified.strftime("%Y-%m-%dT%H:%M:%S.000Z")
-        SubElement(version, "ETag").text = f'"{obj.etag}"'
+        if obj.etag:
+            SubElement(version, "ETag").text = f'"{obj.etag}"'
         SubElement(version, "Size").text = str(obj.size)
         SubElement(version, "StorageClass").text = "STANDARD"
 
@@ -1475,7 +1488,10 @@ def _parse_lifecycle_config(payload: bytes) -> list:
             expiration: dict = {}
             days_el = exp_el.find("{*}Days") or exp_el.find("Days")
             if days_el is not None and days_el.text:
-                expiration["Days"] = int(days_el.text.strip())
+                days_val = int(days_el.text.strip())
+                if days_val <= 0:
+                    raise ValueError("Expiration Days must be a positive integer")
+                expiration["Days"] = days_val
             date_el = exp_el.find("{*}Date") or exp_el.find("Date")
             if date_el is not None and date_el.text:
                 expiration["Date"] = date_el.text.strip()
@@ -1490,7 +1506,10 @@ def _parse_lifecycle_config(payload: bytes) -> list:
             nve: dict = {}
             days_el = nve_el.find("{*}NoncurrentDays") or nve_el.find("NoncurrentDays")
             if days_el is not None and days_el.text:
-                nve["NoncurrentDays"] = int(days_el.text.strip())
+                noncurrent_days = int(days_el.text.strip())
+                if noncurrent_days <= 0:
+                    raise ValueError("NoncurrentDays must be a positive integer")
+                nve["NoncurrentDays"] = noncurrent_days
             if nve:
                 rule["NoncurrentVersionExpiration"] = nve
 
@@ -1499,7 +1518,10 @@ def _parse_lifecycle_config(payload: bytes) -> list:
             aimu: dict = {}
             days_el = aimu_el.find("{*}DaysAfterInitiation") or aimu_el.find("DaysAfterInitiation")
             if days_el is not None and days_el.text:
-                aimu["DaysAfterInitiation"] = int(days_el.text.strip())
+                days_after = int(days_el.text.strip())
+                if days_after <= 0:
+                    raise ValueError("DaysAfterInitiation must be a positive integer")
+                aimu["DaysAfterInitiation"] = days_after
             if aimu:
                 rule["AbortIncompleteMultipartUpload"] = aimu
 
@@ -1792,6 +1814,72 @@ def _bucket_logging_handler(bucket_name: str) -> Response:
     return Response(status=200)
 
 
+def _bucket_uploads_handler(bucket_name: str) -> Response:
+    if request.method != "GET":
+        return _method_not_allowed(["GET"])
+
+    principal, error = _require_principal()
+    if error:
+        return error
+    try:
+        _authorize_action(principal, bucket_name, "list")
+    except IamError as exc:
+        return _error_response("AccessDenied", str(exc), 403)
+
+    storage = _storage()
+    if not storage.bucket_exists(bucket_name):
+        return _error_response("NoSuchBucket", "Bucket does not exist", 404)
+
+    key_marker = request.args.get("key-marker", "")
+    upload_id_marker = request.args.get("upload-id-marker", "")
+    prefix = request.args.get("prefix", "")
+    delimiter = request.args.get("delimiter", "")
+    try:
+        max_uploads = max(1, min(int(request.args.get("max-uploads", 1000)), 1000))
+    except ValueError:
+        return _error_response("InvalidArgument", "max-uploads must be an integer", 400)
+
+    uploads = storage.list_multipart_uploads(bucket_name, include_orphaned=True)
+
+    if prefix:
+        uploads = [u for u in uploads if u["object_key"].startswith(prefix)]
+    if key_marker:
+        uploads = [u for u in uploads if u["object_key"] > key_marker or
+                   (u["object_key"] == key_marker and upload_id_marker and u["upload_id"] > upload_id_marker)]
+
+    uploads.sort(key=lambda u: (u["object_key"], u["upload_id"]))
+
+    is_truncated = len(uploads) > max_uploads
+    if is_truncated:
+        uploads = uploads[:max_uploads]
+
+    root = Element("ListMultipartUploadsResult", xmlns="http://s3.amazonaws.com/doc/2006-03-01/")
+    SubElement(root, "Bucket").text = bucket_name
+    SubElement(root, "KeyMarker").text = key_marker
+    SubElement(root, "UploadIdMarker").text = upload_id_marker
+    if prefix:
+        SubElement(root, "Prefix").text = prefix
+    if delimiter:
+        SubElement(root, "Delimiter").text = delimiter
+    SubElement(root, "MaxUploads").text = str(max_uploads)
+    SubElement(root, "IsTruncated").text = "true" if is_truncated else "false"
+
+    if is_truncated and uploads:
+        SubElement(root, "NextKeyMarker").text = uploads[-1]["object_key"]
+        SubElement(root, "NextUploadIdMarker").text = uploads[-1]["upload_id"]
+
+    for upload in uploads:
+        upload_el = SubElement(root, "Upload")
+        SubElement(upload_el, "Key").text = upload["object_key"]
+        SubElement(upload_el, "UploadId").text = upload["upload_id"]
+        if upload.get("created_at"):
+            SubElement(upload_el, "Initiated").text = upload["created_at"]
+        if upload.get("orphaned"):
+            SubElement(upload_el, "StorageClass").text = "ORPHANED"
+
+    return _xml_response(root)
+
+
 def _object_retention_handler(bucket_name: str, object_key: str) -> Response:
     if request.method not in {"GET", "PUT"}:
         return _method_not_allowed(["GET", "PUT"])
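
With the uploads subresource wired up, a stock S3 client can enumerate in-flight multipart uploads. A boto3 sketch against an assumed local endpoint with placeholder credentials:

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://127.0.0.1:5000",  # assumed local endpoint
    aws_access_key_id="AKIAEXAMPLE",       # placeholder credentials
    aws_secret_access_key="secret",
)
resp = s3.list_multipart_uploads(Bucket="photos", MaxUploads=100)
for upload in resp.get("Uploads", []):
    print(upload["Key"], upload["UploadId"])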
@@ -2085,7 +2173,10 @@ def bucket_handler(bucket_name: str) -> Response:
     list_type = request.args.get("list-type")
     prefix = request.args.get("prefix", "")
     delimiter = request.args.get("delimiter", "")
-    max_keys = min(int(request.args.get("max-keys", current_app.config["UI_PAGE_SIZE"])), 1000)
+    try:
+        max_keys = max(1, min(int(request.args.get("max-keys", current_app.config["UI_PAGE_SIZE"])), 1000))
+    except ValueError:
+        return _error_response("InvalidArgument", "max-keys must be an integer", 400)
 
     marker = request.args.get("marker", "")  # ListObjects v1
     continuation_token = request.args.get("continuation-token", "")  # ListObjectsV2
@@ -2098,7 +2189,7 @@ def bucket_handler(bucket_name: str) -> Response:
     if continuation_token:
         try:
             effective_start = base64.urlsafe_b64decode(continuation_token.encode()).decode("utf-8")
-        except Exception:
+        except (ValueError, UnicodeDecodeError):
            effective_start = continuation_token
     elif start_after:
         effective_start = start_after
@@ -2178,10 +2269,11 @@ def bucket_handler(bucket_name: str) -> Response:
         obj_el = SubElement(root, "Contents")
         SubElement(obj_el, "Key").text = meta.key
         SubElement(obj_el, "LastModified").text = meta.last_modified.isoformat()
-        SubElement(obj_el, "ETag").text = f'"{meta.etag}"'
+        if meta.etag:
+            SubElement(obj_el, "ETag").text = f'"{meta.etag}"'
         SubElement(obj_el, "Size").text = str(meta.size)
         SubElement(obj_el, "StorageClass").text = "STANDARD"
 
     for cp in common_prefixes:
         cp_el = SubElement(root, "CommonPrefixes")
         SubElement(cp_el, "Prefix").text = cp
@@ -2194,15 +2286,16 @@ def bucket_handler(bucket_name: str) -> Response:
     SubElement(root, "IsTruncated").text = "true" if is_truncated else "false"
     if delimiter:
         SubElement(root, "Delimiter").text = delimiter
 
     if is_truncated and delimiter and next_marker:
         SubElement(root, "NextMarker").text = next_marker
 
     for meta in objects:
         obj_el = SubElement(root, "Contents")
         SubElement(obj_el, "Key").text = meta.key
         SubElement(obj_el, "LastModified").text = meta.last_modified.isoformat()
-        SubElement(obj_el, "ETag").text = f'"{meta.etag}"'
+        if meta.etag:
+            SubElement(obj_el, "ETag").text = f'"{meta.etag}"'
         SubElement(obj_el, "Size").text = str(meta.size)
 
     for cp in common_prefixes:
@@ -2282,7 +2375,8 @@ def object_handler(bucket_name: str, object_key: str):
             extra={"bucket": bucket_name, "key": object_key, "size": meta.size},
         )
         response = Response(status=200)
-        response.headers["ETag"] = f'"{meta.etag}"'
+        if meta.etag:
+            response.headers["ETag"] = f'"{meta.etag}"'
 
         _notifications().emit_object_created(
             bucket_name,
@@ -2725,7 +2819,8 @@ def _copy_object(dest_bucket: str, dest_key: str, copy_source: str) -> Response:
 
     root = Element("CopyObjectResult")
     SubElement(root, "LastModified").text = meta.last_modified.isoformat()
-    SubElement(root, "ETag").text = f'"{meta.etag}"'
+    if meta.etag:
+        SubElement(root, "ETag").text = f'"{meta.etag}"'
     return _xml_response(root)
 
 
@@ -2737,7 +2832,7 @@ class AwsChunkedDecoder:
 
     def __init__(self, stream):
         self.stream = stream
-        self._read_buffer = bytearray()  # Performance: Pre-allocated buffer
+        self._read_buffer = bytearray()
         self.chunk_remaining = 0
         self.finished = False
 
@@ -2748,20 +2843,15 @@ class AwsChunkedDecoder:
         """
         line = bytearray()
         while True:
-            # Check if we have data in buffer
             if self._read_buffer:
-                # Look for CRLF in buffer
                 idx = self._read_buffer.find(b"\r\n")
                 if idx != -1:
-                    # Found CRLF - extract line and update buffer
                     line.extend(self._read_buffer[: idx + 2])
                    del self._read_buffer[: idx + 2]
                     return bytes(line)
-                # No CRLF yet - consume entire buffer
                 line.extend(self._read_buffer)
                 self._read_buffer.clear()
 
-            # Read more data in larger chunks (64 bytes is enough for chunk headers)
             chunk = self.stream.read(64)
             if not chunk:
                 return bytes(line) if line else b""
@@ -2770,14 +2860,11 @@ class AwsChunkedDecoder:
     def _read_exact(self, n: int) -> bytes:
         """Read exactly n bytes, using buffer first."""
         result = bytearray()
-        # Use buffered data first
         if self._read_buffer:
             take = min(len(self._read_buffer), n)
             result.extend(self._read_buffer[:take])
             del self._read_buffer[:take]
             n -= take
 
-        # Read remaining directly from stream
         if n > 0:
             data = self.stream.read(n)
             if data:
@@ -2789,7 +2876,7 @@ class AwsChunkedDecoder:
         if self.finished:
             return b""
 
-        result = bytearray()  # Performance: Use bytearray for building result
+        result = bytearray()
         while size == -1 or len(result) < size:
             if self.chunk_remaining > 0:
                 to_read = self.chunk_remaining
@@ -2823,7 +2910,6 @@ class AwsChunkedDecoder:
 
             if chunk_size == 0:
                 self.finished = True
-                # Skip trailing headers
                 while True:
                     trailer = self._read_line()
                     if trailer == b"\r\n" or not trailer:
@@ -2947,8 +3033,9 @@ def _complete_multipart_upload(bucket_name: str, object_key: str) -> Response:
     SubElement(root, "Location").text = location
     SubElement(root, "Bucket").text = bucket_name
     SubElement(root, "Key").text = object_key
-    SubElement(root, "ETag").text = f'"{meta.etag}"'
+    if meta.etag:
+        SubElement(root, "ETag").text = f'"{meta.etag}"'
 
     return _xml_response(root)
 
 
@@ -2963,10 +3050,11 @@ def _abort_multipart_upload(bucket_name: str, object_key: str) -> Response:
 
     try:
         _storage().abort_multipart_upload(bucket_name, upload_id)
+    except BucketNotFoundError as exc:
+        return _error_response("NoSuchBucket", str(exc), 404)
     except StorageError as exc:
-        if "Bucket does not exist" in str(exc):
-            return _error_response("NoSuchBucket", str(exc), 404)
+        current_app.logger.warning(f"Error aborting multipart upload: {exc}")
 
     return Response(status=204)
 
 
@@ -2978,13 +3066,15 @@ def resolve_principal():
             (request.args.get("X-Amz-Algorithm") == "AWS4-HMAC-SHA256"):
             g.principal = _verify_sigv4(request)
             return
-        except Exception:
-            pass
+        except IamError as exc:
+            logger.debug(f"SigV4 authentication failed: {exc}")
+        except (ValueError, KeyError) as exc:
+            logger.debug(f"SigV4 parsing error: {exc}")
 
     access_key = request.headers.get("X-Access-Key")
     secret_key = request.headers.get("X-Secret-Key")
     if access_key and secret_key:
         try:
             g.principal = _iam().authenticate(access_key, secret_key)
-        except Exception:
-            pass
+        except IamError as exc:
+            logger.debug(f"Header authentication failed: {exc}")
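
The stricter X-Amz-Expires validation above rejects non-integer and non-positive values with an IamError instead of letting a bare ValueError escape. A presigned URL generated the usual way still passes; sketch with an assumed endpoint and placeholder credentials:

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://127.0.0.1:5000",  # assumed local endpoint
    aws_access_key_id="AKIAEXAMPLE",       # placeholder credentials
    aws_secret_access_key="secret",
)
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "photos", "Key": "cat.png"},
    ExpiresIn=3600,  # must encode as a positive integer after this change
)
print(url)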

app/storage.py (228 changed lines)

@@ -76,6 +76,14 @@ class StorageError(RuntimeError):
     """Raised when the storage layer encounters an unrecoverable problem."""
 
 
+class BucketNotFoundError(StorageError):
+    """Raised when the bucket does not exist."""
+
+
+class ObjectNotFoundError(StorageError):
+    """Raised when the object does not exist."""
+
+
 class QuotaExceededError(StorageError):
     """Raised when an operation would exceed bucket quota limits."""
 
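
Because both new classes subclass StorageError, existing broad handlers keep working while callers can now branch on the specific case, as the tagging handlers in app/s3_api.py above do. A handling sketch (module path, data directory, and names are assumptions):

from pathlib import Path

from app.storage import (
    BucketNotFoundError,
    ObjectNotFoundError,
    ObjectStorage,
    StorageError,
)

storage = ObjectStorage(Path("./data"))
try:
    tags = storage.get_object_tags("photos", "cat.png")
except BucketNotFoundError as exc:
    print(f"no such bucket: {exc}")
except ObjectNotFoundError as exc:
    print(f"no such key: {exc}")
except StorageError as exc:  # any other storage failure
    print(f"internal error: {exc}")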
@@ -90,7 +98,7 @@ class ObjectMeta:
|
|||||||
key: str
|
key: str
|
||||||
size: int
|
size: int
|
||||||
last_modified: datetime
|
last_modified: datetime
|
||||||
etag: str
|
etag: Optional[str] = None
|
||||||
metadata: Optional[Dict[str, str]] = None
|
metadata: Optional[Dict[str, str]] = None
|
||||||
|
|
||||||
|
|
||||||
@@ -106,7 +114,7 @@ class ListObjectsResult:
|
|||||||
objects: List[ObjectMeta]
|
objects: List[ObjectMeta]
|
||||||
is_truncated: bool
|
is_truncated: bool
|
||||||
next_continuation_token: Optional[str]
|
next_continuation_token: Optional[str]
|
||||||
total_count: Optional[int] = None # Total objects in bucket (from stats cache)
|
total_count: Optional[int] = None
|
||||||
|
|
||||||
|
|
||||||
def _utcnow() -> datetime:
|
def _utcnow() -> datetime:
|
||||||
@@ -129,23 +137,20 @@ class ObjectStorage:
     BUCKET_VERSIONS_DIR = "versions"
     MULTIPART_MANIFEST = "manifest.json"
     BUCKET_CONFIG_FILE = ".bucket.json"
-    KEY_INDEX_CACHE_TTL = 30
+    DEFAULT_CACHE_TTL = 5
-    OBJECT_CACHE_MAX_SIZE = 100  # Maximum number of buckets to cache
+    OBJECT_CACHE_MAX_SIZE = 100

-    def __init__(self, root: Path) -> None:
+    def __init__(self, root: Path, cache_ttl: int = DEFAULT_CACHE_TTL) -> None:
         self.root = Path(root)
         self.root.mkdir(parents=True, exist_ok=True)
         self._ensure_system_roots()
-        # LRU cache for object metadata with thread-safe access
         self._object_cache: OrderedDict[str, tuple[Dict[str, ObjectMeta], float]] = OrderedDict()
-        self._cache_lock = threading.Lock()  # Global lock for cache structure
+        self._cache_lock = threading.Lock()
-        # Performance: Per-bucket locks to reduce contention
         self._bucket_locks: Dict[str, threading.Lock] = {}
-        # Cache version counter for detecting stale reads
         self._cache_version: Dict[str, int] = {}
-        # Performance: Bucket config cache with TTL
         self._bucket_config_cache: Dict[str, tuple[dict[str, Any], float]] = {}
-        self._bucket_config_cache_ttl = 30.0  # 30 second TTL
+        self._bucket_config_cache_ttl = 30.0
+        self._cache_ttl = cache_ttl

     def _get_bucket_lock(self, bucket_id: str) -> threading.Lock:
         """Get or create a lock for a specific bucket. Reduces global lock contention."""
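The listing-cache TTL is now a constructor parameter instead of a class constant, so each deployment can trade freshness against directory-scan cost. A minimal wiring sketch, assuming `ObjectStorage` is importable from an `app.storage` module (the import path is not shown in this diff):

```python
from pathlib import Path

from app.storage import ObjectStorage  # assumed module path

# The default TTL dropped from 30s to DEFAULT_CACHE_TTL = 5s; pass a
# larger value for read-heavy buckets whose contents rarely change.
storage = ObjectStorage(Path("data"), cache_ttl=30)
```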
@@ -170,6 +175,11 @@ class ObjectStorage:
     def bucket_exists(self, bucket_name: str) -> bool:
         return self._bucket_path(bucket_name).exists()

+    def _require_bucket_exists(self, bucket_path: Path) -> None:
+        """Raise BucketNotFoundError if bucket does not exist."""
+        if not bucket_path.exists():
+            raise BucketNotFoundError("Bucket does not exist")
+
     def _validate_bucket_name(self, bucket_name: str) -> None:
         if len(bucket_name) < 3 or len(bucket_name) > 63:
             raise StorageError("Bucket name must be between 3 and 63 characters")
@@ -188,14 +198,14 @@ class ObjectStorage:

     def bucket_stats(self, bucket_name: str, cache_ttl: int = 60) -> dict[str, int]:
         """Return object count and total size for the bucket (cached).

         Args:
             bucket_name: Name of the bucket
             cache_ttl: Cache time-to-live in seconds (default 60)
         """
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")

         cache_path = self._system_bucket_root(bucket_name) / "stats.json"
         if cache_path.exists():
@@ -257,8 +267,7 @@ class ObjectStorage:
     def delete_bucket(self, bucket_name: str) -> None:
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
-        # Performance: Single check instead of three separate traversals
         has_objects, has_versions, has_multipart = self._check_bucket_contents(bucket_path)
         if has_objects:
             raise StorageError("Bucket not empty")
@@ -291,7 +300,7 @@ class ObjectStorage:
         """
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         bucket_id = bucket_path.name

         object_cache = self._get_object_cache(bucket_id, bucket_path)
@@ -352,7 +361,7 @@ class ObjectStorage:
     ) -> ObjectMeta:
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         bucket_id = bucket_path.name

         safe_key = self._sanitize_object_key(object_key)
@@ -409,7 +418,6 @@ class ObjectStorage:

         self._invalidate_bucket_stats_cache(bucket_id)

-        # Performance: Lazy update - only update the affected key instead of invalidating whole cache
         obj_meta = ObjectMeta(
             key=safe_key.as_posix(),
             size=stat.st_size,
@@ -424,7 +432,7 @@ class ObjectStorage:
     def get_object_path(self, bucket_name: str, object_key: str) -> Path:
         path = self._object_path(bucket_name, object_key)
         if not path.exists():
-            raise StorageError("Object not found")
+            raise ObjectNotFoundError("Object not found")
         return path

     def get_object_metadata(self, bucket_name: str, object_key: str) -> Dict[str, str]:
@@ -467,7 +475,6 @@ class ObjectStorage:
         self._delete_metadata(bucket_id, rel)

         self._invalidate_bucket_stats_cache(bucket_id)
-        # Performance: Lazy update - only remove the affected key instead of invalidating whole cache
         self._update_object_cache_entry(bucket_id, safe_key.as_posix(), None)
         self._cleanup_empty_parents(path, bucket_path)

@@ -490,14 +497,13 @@ class ObjectStorage:
         shutil.rmtree(legacy_version_dir, ignore_errors=True)

         self._invalidate_bucket_stats_cache(bucket_id)
-        # Performance: Lazy update - only remove the affected key instead of invalidating whole cache
         self._update_object_cache_entry(bucket_id, rel.as_posix(), None)
         self._cleanup_empty_parents(target, bucket_path)

     def is_versioning_enabled(self, bucket_name: str) -> bool:
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         return self._is_versioning_enabled(bucket_path)

     def set_bucket_versioning(self, bucket_name: str, enabled: bool) -> None:
@@ -689,11 +695,11 @@ class ObjectStorage:
         """Get tags for an object."""
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         safe_key = self._sanitize_object_key(object_key)
         object_path = bucket_path / safe_key
         if not object_path.exists():
-            raise StorageError("Object does not exist")
+            raise ObjectNotFoundError("Object does not exist")

         for meta_file in (self._metadata_file(bucket_path.name, safe_key), self._legacy_metadata_file(bucket_path.name, safe_key)):
             if not meta_file.exists():
@@ -712,11 +718,11 @@ class ObjectStorage:
         """Set tags for an object."""
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         safe_key = self._sanitize_object_key(object_key)
         object_path = bucket_path / safe_key
         if not object_path.exists():
-            raise StorageError("Object does not exist")
+            raise ObjectNotFoundError("Object does not exist")

         meta_file = self._metadata_file(bucket_path.name, safe_key)

@@ -750,7 +756,7 @@ class ObjectStorage:
     def list_object_versions(self, bucket_name: str, object_key: str) -> List[Dict[str, Any]]:
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         bucket_id = bucket_path.name
         safe_key = self._sanitize_object_key(object_key)
         version_dir = self._version_dir(bucket_id, safe_key)
@@ -774,7 +780,7 @@ class ObjectStorage:
     def restore_object_version(self, bucket_name: str, object_key: str, version_id: str) -> ObjectMeta:
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         bucket_id = bucket_path.name
         safe_key = self._sanitize_object_key(object_key)
         version_dir = self._version_dir(bucket_id, safe_key)
@@ -811,7 +817,7 @@ class ObjectStorage:
     def delete_object_version(self, bucket_name: str, object_key: str, version_id: str) -> None:
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         bucket_id = bucket_path.name
         safe_key = self._sanitize_object_key(object_key)
         version_dir = self._version_dir(bucket_id, safe_key)
@@ -834,7 +840,7 @@ class ObjectStorage:
     def list_orphaned_objects(self, bucket_name: str) -> List[Dict[str, Any]]:
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         bucket_id = bucket_path.name
         version_roots = [self._bucket_versions_root(bucket_id), self._legacy_versions_root(bucket_id)]
         if not any(root.exists() for root in version_roots):
@@ -902,7 +908,7 @@ class ObjectStorage:
     ) -> str:
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         bucket_id = bucket_path.name
         safe_key = self._sanitize_object_key(object_key)
         upload_id = uuid.uuid4().hex
@@ -929,8 +935,8 @@ class ObjectStorage:

         Uses file locking to safely update the manifest and handle concurrent uploads.
         """
-        if part_number < 1:
-            raise StorageError("part_number must be >= 1")
+        if part_number < 1 or part_number > 10000:
+            raise StorageError("part_number must be between 1 and 10000")
         bucket_path = self._bucket_path(bucket_name)

         upload_root = self._multipart_dir(bucket_path.name, upload_id)
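The new upper bound of 10,000 parts mirrors Amazon S3's documented per-upload part limit, so clients tuned for S3 semantics behave identically against this API.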
@@ -939,7 +945,6 @@ class ObjectStorage:
         if not upload_root.exists():
             raise StorageError("Multipart upload not found")

-        # Write part to temporary file first, then rename atomically
         checksum = hashlib.md5()
         part_filename = f"part-{part_number:05d}.part"
         part_path = upload_root / part_filename
@@ -948,11 +953,8 @@ class ObjectStorage:
         try:
             with temp_path.open("wb") as target:
                 shutil.copyfileobj(_HashingReader(stream, checksum), target)
-
-            # Atomic rename (or replace on Windows)
             temp_path.replace(part_path)
         except OSError:
-            # Clean up temp file on failure
             try:
                 temp_path.unlink(missing_ok=True)
             except OSError:
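The part writer relies on the write-to-temp-then-`replace` idiom so a crashed or interrupted upload never leaves a half-written part visible. A self-contained sketch of the same pattern (illustrative only, not the project's actual helper):

```python
from pathlib import Path

def atomic_write_bytes(path: Path, data: bytes) -> None:
    """Write data so readers never observe a partially written file."""
    tmp = path.with_name(path.name + ".tmp")
    try:
        tmp.write_bytes(data)
        tmp.replace(path)  # atomic rename on POSIX; replaces an existing file on Windows
    except OSError:
        tmp.unlink(missing_ok=True)  # clean up the temp file on failure
        raise
```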
@@ -968,7 +970,6 @@ class ObjectStorage:
         manifest_path = upload_root / self.MULTIPART_MANIFEST
         lock_path = upload_root / ".manifest.lock"

-        # Retry loop for handling transient lock/read failures
         max_retries = 3
         for attempt in range(max_retries):
             try:
@@ -1079,11 +1080,6 @@ class ObjectStorage:
                     checksum.update(data)
                     target.write(data)

-            metadata = manifest.get("metadata")
-            if metadata:
-                self._write_metadata(bucket_id, safe_key, metadata)
-            else:
-                self._delete_metadata(bucket_id, safe_key)
         except BlockingIOError:
             raise StorageError("Another upload to this key is in progress")
         finally:
@@ -1097,12 +1093,18 @@ class ObjectStorage:
         self._invalidate_bucket_stats_cache(bucket_id)

         stat = destination.stat()
-        # Performance: Lazy update - only update the affected key instead of invalidating whole cache
+        etag = checksum.hexdigest()
+        metadata = manifest.get("metadata")
+
+        internal_meta = {"__etag__": etag, "__size__": str(stat.st_size)}
+        combined_meta = {**internal_meta, **(metadata or {})}
+        self._write_metadata(bucket_id, safe_key, combined_meta)
+
         obj_meta = ObjectMeta(
             key=safe_key.as_posix(),
             size=stat.st_size,
             last_modified=datetime.fromtimestamp(stat.st_mtime, timezone.utc),
-            etag=checksum.hexdigest(),
+            etag=etag,
             metadata=metadata,
         )
         self._update_object_cache_entry(bucket_id, safe_key.as_posix(), obj_meta)
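With this change the ETag computed at completion is persisted in the metadata sidecar under the internal `__etag__` key (alongside `__size__`), so later listings can reuse the stored value rather than synthesizing one; this appears to be why the size/mtime ETag fallback is dropped from the cache-building hunk further below.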
@@ -1146,47 +1148,57 @@ class ObjectStorage:
         parts.sort(key=lambda x: x["PartNumber"])
         return parts

-    def list_multipart_uploads(self, bucket_name: str) -> List[Dict[str, Any]]:
-        """List all active multipart uploads for a bucket."""
+    def list_multipart_uploads(self, bucket_name: str, include_orphaned: bool = False) -> List[Dict[str, Any]]:
+        """List all active multipart uploads for a bucket.
+
+        Args:
+            bucket_name: The bucket to list uploads for.
+            include_orphaned: If True, also include upload directories that have
+                files but no valid manifest.json (orphaned/interrupted uploads).
+        """
         bucket_path = self._bucket_path(bucket_name)
         if not bucket_path.exists():
-            raise StorageError("Bucket does not exist")
+            raise BucketNotFoundError("Bucket does not exist")
         bucket_id = bucket_path.name
         uploads = []
-        multipart_root = self._bucket_multipart_root(bucket_id)
-        if multipart_root.exists():
-            for upload_dir in multipart_root.iterdir():
-                if not upload_dir.is_dir():
-                    continue
-                manifest_path = upload_dir / "manifest.json"
-                if not manifest_path.exists():
-                    continue
-                try:
-                    manifest = json.loads(manifest_path.read_text(encoding="utf-8"))
-                    uploads.append({
-                        "upload_id": manifest.get("upload_id", upload_dir.name),
-                        "object_key": manifest.get("object_key", ""),
-                        "created_at": manifest.get("created_at", ""),
-                    })
-                except (OSError, json.JSONDecodeError):
-                    continue
-        legacy_root = self._legacy_multipart_root(bucket_id)
-        if legacy_root.exists():
-            for upload_dir in legacy_root.iterdir():
-                if not upload_dir.is_dir():
-                    continue
-                manifest_path = upload_dir / "manifest.json"
-                if not manifest_path.exists():
-                    continue
-                try:
-                    manifest = json.loads(manifest_path.read_text(encoding="utf-8"))
-                    uploads.append({
-                        "upload_id": manifest.get("upload_id", upload_dir.name),
-                        "object_key": manifest.get("object_key", ""),
-                        "created_at": manifest.get("created_at", ""),
-                    })
-                except (OSError, json.JSONDecodeError):
-                    continue
+        for multipart_root in (
+            self._multipart_bucket_root(bucket_id),
+            self._legacy_multipart_bucket_root(bucket_id),
+        ):
+            if not multipart_root.exists():
+                continue
+            for upload_dir in multipart_root.iterdir():
+                if not upload_dir.is_dir():
+                    continue
+                manifest_path = upload_dir / "manifest.json"
+                if manifest_path.exists():
+                    try:
+                        manifest = json.loads(manifest_path.read_text(encoding="utf-8"))
+                        uploads.append({
+                            "upload_id": manifest.get("upload_id", upload_dir.name),
+                            "object_key": manifest.get("object_key", ""),
+                            "created_at": manifest.get("created_at", ""),
+                        })
+                    except (OSError, json.JSONDecodeError):
+                        if include_orphaned:
+                            has_files = any(upload_dir.rglob("*"))
+                            if has_files:
+                                uploads.append({
+                                    "upload_id": upload_dir.name,
+                                    "object_key": "(unknown)",
+                                    "created_at": "",
+                                    "orphaned": True,
+                                })
+                elif include_orphaned:
+                    has_files = any(f.is_file() for f in upload_dir.rglob("*"))
+                    if has_files:
+                        uploads.append({
+                            "upload_id": upload_dir.name,
+                            "object_key": "(unknown)",
+                            "created_at": "",
+                            "orphaned": True,
+                        })

         return uploads

     def _bucket_path(self, bucket_name: str) -> Path:
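A short usage sketch of the reworked listing, assuming a `storage` instance as above; the `orphaned` flag only appears on entries recovered without a valid manifest:

```python
uploads = storage.list_multipart_uploads("my-bucket", include_orphaned=True)
for upload in uploads:
    if upload.get("orphaned"):
        # No manifest.json survived; a candidate for cleanup/abort.
        print(f"orphaned upload dir: {upload['upload_id']}")
    else:
        print(f"{upload['upload_id']} -> {upload['object_key']} ({upload['created_at']})")
```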
@@ -1369,10 +1381,7 @@ class ObjectStorage:
             stat = entry.stat()

             etag = meta_cache.get(key)
-
-            if not etag:
-                etag = f'"{stat.st_size}-{int(stat.st_mtime)}"'

             objects[key] = ObjectMeta(
                 key=key,
                 size=stat.st_size,
@@ -1396,38 +1405,30 @@ class ObjectStorage:
         """
         now = time.time()

-        # Quick check with global lock (brief)
         with self._cache_lock:
             cached = self._object_cache.get(bucket_id)
             if cached:
                 objects, timestamp = cached
-                if now - timestamp < self.KEY_INDEX_CACHE_TTL:
+                if now - timestamp < self._cache_ttl:
                     self._object_cache.move_to_end(bucket_id)
                     return objects
             cache_version = self._cache_version.get(bucket_id, 0)

-        # Use per-bucket lock for cache building (allows parallel builds for different buckets)
         bucket_lock = self._get_bucket_lock(bucket_id)
         with bucket_lock:
-            # Double-check cache after acquiring per-bucket lock
             with self._cache_lock:
                 cached = self._object_cache.get(bucket_id)
                 if cached:
                     objects, timestamp = cached
-                    if now - timestamp < self.KEY_INDEX_CACHE_TTL:
+                    if now - timestamp < self._cache_ttl:
                         self._object_cache.move_to_end(bucket_id)
                         return objects

-            # Build cache with per-bucket lock held (prevents duplicate work)
             objects = self._build_object_cache(bucket_path)

             with self._cache_lock:
-                # Check if cache was invalidated while we were building
                 current_version = self._cache_version.get(bucket_id, 0)
                 if current_version != cache_version:
                     objects = self._build_object_cache(bucket_path)

-                # Evict oldest entries if cache is full
                 while len(self._object_cache) >= self.OBJECT_CACHE_MAX_SIZE:
                     self._object_cache.popitem(last=False)
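The lookup keeps the existing double-checked pattern: a brief check under the global lock, the rebuild under a per-bucket lock, and a final comparison of the version counter so a cache invalidated mid-build is rebuilt rather than overwritten with stale entries. Only the TTL source changes here, from the class constant to the per-instance `_cache_ttl`.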
@@ -1461,12 +1462,39 @@ class ObjectStorage:
             if cached:
                 objects, timestamp = cached
                 if meta is None:
-                    # Delete operation - remove key from cache
                     objects.pop(key, None)
                 else:
-                    # Put operation - update/add key in cache
                     objects[key] = meta
-                # Keep same timestamp - don't reset TTL for single key updates
+
+    def warm_cache(self, bucket_names: Optional[List[str]] = None) -> None:
+        """Pre-warm the object cache for specified buckets or all buckets.
+
+        This is called on startup to ensure the first request is fast.
+        """
+        if bucket_names is None:
+            bucket_names = [b.name for b in self.list_buckets()]
+
+        for bucket_name in bucket_names:
+            try:
+                bucket_path = self._bucket_path(bucket_name)
+                if bucket_path.exists():
+                    self._get_object_cache(bucket_path.name, bucket_path)
+            except Exception:
+                pass
+
+    def warm_cache_async(self, bucket_names: Optional[List[str]] = None) -> threading.Thread:
+        """Start cache warming in a background thread.
+
+        Returns the thread object so caller can optionally wait for it.
+        """
+        thread = threading.Thread(
+            target=self.warm_cache,
+            args=(bucket_names,),
+            daemon=True,
+            name="cache-warmer",
+        )
+        thread.start()
+        return thread

     def _ensure_system_roots(self) -> None:
         for path in (
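A minimal startup sketch for the new warmers (a hypothetical call site; the actual app factory wiring is not part of this diff):

```python
# Fire-and-forget warming in a daemon thread; join() only if the very
# first request must hit a warm cache.
warmer = storage.warm_cache_async()
warmer.join(timeout=30)  # optional: bound startup latency
```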
@@ -1487,13 +1515,12 @@ class ObjectStorage:
         return self._system_bucket_root(bucket_name) / self.BUCKET_CONFIG_FILE

     def _read_bucket_config(self, bucket_name: str) -> dict[str, Any]:
-        # Performance: Check cache first
         now = time.time()
         cached = self._bucket_config_cache.get(bucket_name)
         if cached:
             config, cached_time = cached
             if now - cached_time < self._bucket_config_cache_ttl:
-                return config.copy()  # Return copy to prevent mutation
+                return config.copy()

         config_path = self._bucket_config_path(bucket_name)
         if not config_path.exists():
@@ -1512,7 +1539,6 @@ class ObjectStorage:
         config_path = self._bucket_config_path(bucket_name)
         config_path.parent.mkdir(parents=True, exist_ok=True)
         config_path.write_text(json.dumps(payload), encoding="utf-8")
-        # Performance: Update cache immediately after write
         self._bucket_config_cache[bucket_name] = (payload.copy(), time.time())

     def _set_bucket_config_entry(self, bucket_name: str, key: str, value: Any | None) -> None:
@@ -1638,7 +1664,6 @@ class ObjectStorage:
     def _check_bucket_contents(self, bucket_path: Path) -> tuple[bool, bool, bool]:
         """Check bucket for objects, versions, and multipart uploads in a single pass.

-        Performance optimization: Combines three separate rglob traversals into one.
         Returns (has_visible_objects, has_archived_versions, has_active_multipart_uploads).
         Uses early exit when all three are found.
         """
@@ -1647,7 +1672,6 @@ class ObjectStorage:
         has_multipart = False
         bucket_name = bucket_path.name

-        # Check visible objects in bucket
         for path in bucket_path.rglob("*"):
             if has_objects:
                 break
@@ -1658,7 +1682,6 @@ class ObjectStorage:
                 continue
             has_objects = True

-        # Check archived versions (only if needed)
         for version_root in (
             self._bucket_versions_root(bucket_name),
             self._legacy_versions_root(bucket_name),
@@ -1671,7 +1694,6 @@ class ObjectStorage:
                 has_versions = True
                 break

-        # Check multipart uploads (only if needed)
         for uploads_root in (
             self._multipart_bucket_root(bucket_name),
             self._legacy_multipart_bucket_root(bucket_name),
@@ -1705,7 +1727,7 @@ class ObjectStorage:
         try:
             os.chmod(target_path, stat.S_IRWXU)
             func(target_path)
-        except Exception as exc:  # pragma: no cover - fallback failure
+        except Exception as exc:
            raise StorageError(f"Unable to delete bucket contents: {exc}") from exc

         try:
@@ -1,6 +1,6 @@
 from __future__ import annotations

-APP_VERSION = "0.2.0"
+APP_VERSION = "0.2.1"


 def get_version() -> str:

46  docs.md
@@ -189,6 +189,52 @@ All configuration is done via environment variables. The table below lists every
 | `KMS_ENABLED` | `false` | Enable KMS key management for encryption. |
 | `KMS_KEYS_PATH` | `data/.myfsio.sys/keys/kms_keys.json` | Path to store KMS key metadata. |

+## Lifecycle Rules
+
+Lifecycle rules automate object management by scheduling deletions based on object age.
+
+### Enabling Lifecycle Enforcement
+
+By default, lifecycle enforcement is disabled. Enable it by setting the environment variable:
+
+```bash
+LIFECYCLE_ENABLED=true python run.py
+```
+
+Or in your `myfsio.env` file:
+
+```
+LIFECYCLE_ENABLED=true
+LIFECYCLE_INTERVAL_SECONDS=3600  # Check interval (default: 1 hour)
+```
+
+### Configuring Rules
+
+Once enabled, configure lifecycle rules via:
+- **Web UI:** Bucket Details → Lifecycle tab → Add Rule
+- **S3 API:** `PUT /<bucket>?lifecycle` with XML configuration
+
+### Available Actions
+
+| Action | Description |
+|--------|-------------|
+| **Expiration** | Delete current version objects after N days |
+| **NoncurrentVersionExpiration** | Delete old versions N days after becoming noncurrent (requires versioning) |
+| **AbortIncompleteMultipartUpload** | Clean up incomplete multipart uploads after N days |
+
+### Example Configuration (XML)
+
+```xml
+<LifecycleConfiguration>
+  <Rule>
+    <ID>DeleteOldLogs</ID>
+    <Status>Enabled</Status>
+    <Filter><Prefix>logs/</Prefix></Filter>
+    <Expiration><Days>30</Days></Expiration>
+  </Rule>
+</LifecycleConfiguration>
+```
+
 ### Performance Tuning

 | Variable | Default | Notes |
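Because the lifecycle endpoint is the standard `PUT /<bucket>?lifecycle`, the XML above can also be applied with boto3. A sketch, assuming a local endpoint and placeholder credentials (none of these values come from this diff):

```python
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://127.0.0.1:5000",  # assumed myfsio S3 API address
    aws_access_key_id="test",               # placeholder credentials
    aws_secret_access_key="test",
)
s3.put_bucket_lifecycle_configuration(
    Bucket="my-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "DeleteOldLogs",
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},
            "Expiration": {"Days": 30},
        }]
    },
)
```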
@@ -1,3 +1,5 @@
 [pytest]
 testpaths = tests
 norecursedirs = data .git __pycache__ .venv
+markers =
+    integration: marks tests as integration tests (may require external services)
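With the marker registered, integration tests can be selected with `pytest -m integration` or skipped with `pytest -m "not integration"`, and pytest no longer warns about an unknown marker.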
@@ -2,6 +2,7 @@ Flask>=3.1.2
 Flask-Limiter>=4.1.1
 Flask-Cors>=6.0.2
 Flask-WTF>=1.2.2
+python-dotenv>=1.2.1
 pytest>=9.0.2
 requests>=2.32.5
 boto3>=1.42.14

11  run.py
@@ -6,6 +6,17 @@ import os
 import sys
 import warnings
 from multiprocessing import Process
+from pathlib import Path
+
+from dotenv import load_dotenv
+
+for _env_file in [
+    Path("/opt/myfsio/myfsio.env"),
+    Path.cwd() / ".env",
+    Path.cwd() / "myfsio.env",
+]:
+    if _env_file.exists():
+        load_dotenv(_env_file, override=True)

 from app import create_api_app, create_ui_app
 from app.config import AppConfig
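Because each existing file is loaded with `override=True`, later entries in the list win: a `myfsio.env` in the working directory overrides `.env`, which in turn overrides `/opt/myfsio/myfsio.env`.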
1051  static/css/main.css
File diff suppressed because it is too large

4169  static/js/bucket-detail-main.js (new file)
File diff suppressed because it is too large

192  static/js/bucket-detail-operations.js (new file)
@@ -0,0 +1,192 @@
window.BucketDetailOperations = (function() {
    'use strict';

    let showMessage = function() {};
    let escapeHtml = function(s) { return s; };

    function init(config) {
        showMessage = config.showMessage || showMessage;
        escapeHtml = config.escapeHtml || escapeHtml;
    }

    async function loadLifecycleRules(card, endpoint) {
        if (!card || !endpoint) return;
        const body = card.querySelector('[data-lifecycle-body]');
        if (!body) return;

        try {
            const response = await fetch(endpoint);
            const data = await response.json();

            if (!response.ok) {
                body.innerHTML = `<tr><td colspan="5" class="text-center text-danger py-3">${escapeHtml(data.error || 'Failed to load')}</td></tr>`;
                return;
            }

            const rules = data.rules || [];
            if (rules.length === 0) {
                body.innerHTML = '<tr><td colspan="5" class="text-center text-muted py-3">No lifecycle rules configured</td></tr>';
                return;
            }

            body.innerHTML = rules.map(rule => {
                const actions = [];
                if (rule.expiration_days) actions.push(`Delete after ${rule.expiration_days} days`);
                if (rule.noncurrent_days) actions.push(`Delete old versions after ${rule.noncurrent_days} days`);
                if (rule.abort_mpu_days) actions.push(`Abort incomplete MPU after ${rule.abort_mpu_days} days`);

                return `
                    <tr>
                        <td class="fw-medium">${escapeHtml(rule.id)}</td>
                        <td><code>${escapeHtml(rule.prefix || '(all)')}</code></td>
                        <td>${actions.map(a => `<div class="small">${escapeHtml(a)}</div>`).join('')}</td>
                        <td>
                            <span class="badge ${rule.status === 'Enabled' ? 'text-bg-success' : 'text-bg-secondary'}">${escapeHtml(rule.status)}</span>
                        </td>
                        <td class="text-end">
                            <button class="btn btn-sm btn-outline-danger" onclick="BucketDetailOperations.deleteLifecycleRule('${escapeHtml(rule.id)}')">
                                <svg xmlns="http://www.w3.org/2000/svg" width="12" height="12" fill="currentColor" viewBox="0 0 16 16">
                                    <path d="M5.5 5.5A.5.5 0 0 1 6 6v6a.5.5 0 0 1-1 0V6a.5.5 0 0 1 .5-.5zm2.5 0a.5.5 0 0 1 .5.5v6a.5.5 0 0 1-1 0V6a.5.5 0 0 1 .5-.5zm3 .5a.5.5 0 0 0-1 0v6a.5.5 0 0 0 1 0V6z"/>
                                    <path fill-rule="evenodd" d="M14.5 3a1 1 0 0 1-1 1H13v9a2 2 0 0 1-2 2H5a2 2 0 0 1-2-2V4h-.5a1 1 0 0 1-1-1V2a1 1 0 0 1 1-1H6a1 1 0 0 1 1-1h2a1 1 0 0 1 1 1h3.5a1 1 0 0 1 1 1v1zM4.118 4 4 4.059V13a1 1 0 0 0 1 1h6a1 1 0 0 0 1-1V4.059L11.882 4H4.118zM2.5 3V2h11v1h-11z"/>
                                </svg>
                            </button>
                        </td>
                    </tr>
                `;
            }).join('');
        } catch (err) {
            body.innerHTML = `<tr><td colspan="5" class="text-center text-danger py-3">${escapeHtml(err.message)}</td></tr>`;
        }
    }

    async function loadCorsRules(card, endpoint) {
        if (!card || !endpoint) return;
        const body = document.getElementById('cors-rules-body');
        if (!body) return;

        try {
            const response = await fetch(endpoint);
            const data = await response.json();

            if (!response.ok) {
                body.innerHTML = `<tr><td colspan="5" class="text-center text-danger py-3">${escapeHtml(data.error || 'Failed to load')}</td></tr>`;
                return;
            }

            const rules = data.rules || [];
            if (rules.length === 0) {
                body.innerHTML = '<tr><td colspan="5" class="text-center text-muted py-3">No CORS rules configured</td></tr>';
                return;
            }

            body.innerHTML = rules.map((rule, idx) => `
                <tr>
                    <td>${(rule.allowed_origins || []).map(o => `<code class="d-block">${escapeHtml(o)}</code>`).join('')}</td>
                    <td>${(rule.allowed_methods || []).map(m => `<span class="badge text-bg-secondary me-1">${escapeHtml(m)}</span>`).join('')}</td>
                    <td class="small text-muted">${(rule.allowed_headers || []).slice(0, 3).join(', ')}${(rule.allowed_headers || []).length > 3 ? '...' : ''}</td>
                    <td class="text-muted">${rule.max_age_seconds || 0}s</td>
                    <td class="text-end">
                        <button class="btn btn-sm btn-outline-danger" onclick="BucketDetailOperations.deleteCorsRule(${idx})">
                            <svg xmlns="http://www.w3.org/2000/svg" width="12" height="12" fill="currentColor" viewBox="0 0 16 16">
                                <path d="M5.5 5.5A.5.5 0 0 1 6 6v6a.5.5 0 0 1-1 0V6a.5.5 0 0 1 .5-.5zm2.5 0a.5.5 0 0 1 .5.5v6a.5.5 0 0 1-1 0V6a.5.5 0 0 1 .5-.5zm3 .5a.5.5 0 0 0-1 0v6a.5.5 0 0 0 1 0V6z"/>
                                <path fill-rule="evenodd" d="M14.5 3a1 1 0 0 1-1 1H13v9a2 2 0 0 1-2 2H5a2 2 0 0 1-2-2V4h-.5a1 1 0 0 1-1-1V2a1 1 0 0 1 1-1H6a1 1 0 0 1 1-1h2a1 1 0 0 1 1 1h3.5a1 1 0 0 1 1 1v1zM4.118 4 4 4.059V13a1 1 0 0 0 1 1h6a1 1 0 0 0 1-1V4.059L11.882 4H4.118zM2.5 3V2h11v1h-11z"/>
                            </svg>
                        </button>
                    </td>
                </tr>
            `).join('');
        } catch (err) {
            body.innerHTML = `<tr><td colspan="5" class="text-center text-danger py-3">${escapeHtml(err.message)}</td></tr>`;
        }
    }

    async function loadAcl(card, endpoint) {
        if (!card || !endpoint) return;
        const body = card.querySelector('[data-acl-body]');
        if (!body) return;

        try {
            const response = await fetch(endpoint);
            const data = await response.json();

            if (!response.ok) {
                body.innerHTML = `<tr><td colspan="3" class="text-center text-danger py-3">${escapeHtml(data.error || 'Failed to load')}</td></tr>`;
                return;
            }

            const grants = data.grants || [];
            if (grants.length === 0) {
                body.innerHTML = '<tr><td colspan="3" class="text-center text-muted py-3">No ACL grants configured</td></tr>';
                return;
            }

            body.innerHTML = grants.map(grant => {
                const grantee = grant.grantee_type === 'CanonicalUser'
                    ? grant.display_name || grant.grantee_id
                    : grant.grantee_uri || grant.grantee_type;
                return `
                    <tr>
                        <td class="fw-medium">${escapeHtml(grantee)}</td>
                        <td><span class="badge text-bg-info">${escapeHtml(grant.permission)}</span></td>
                        <td class="text-muted small">${escapeHtml(grant.grantee_type)}</td>
                    </tr>
                `;
            }).join('');
        } catch (err) {
            body.innerHTML = `<tr><td colspan="3" class="text-center text-danger py-3">${escapeHtml(err.message)}</td></tr>`;
        }
    }

    async function deleteLifecycleRule(ruleId) {
        if (!confirm(`Delete lifecycle rule "${ruleId}"?`)) return;
        const card = document.getElementById('lifecycle-rules-card');
        if (!card) return;
        const endpoint = card.dataset.lifecycleUrl;
        const csrfToken = window.getCsrfToken ? window.getCsrfToken() : '';

        try {
            const resp = await fetch(endpoint, {
                method: 'DELETE',
                headers: { 'Content-Type': 'application/json', 'X-CSRFToken': csrfToken },
                body: JSON.stringify({ rule_id: ruleId })
            });
            const data = await resp.json();
            if (!resp.ok) throw new Error(data.error || 'Failed to delete');
            showMessage({ title: 'Rule deleted', body: `Lifecycle rule "${ruleId}" has been deleted.`, variant: 'success' });
            loadLifecycleRules(card, endpoint);
        } catch (err) {
            showMessage({ title: 'Delete failed', body: err.message, variant: 'danger' });
        }
    }

    async function deleteCorsRule(index) {
        if (!confirm('Delete this CORS rule?')) return;
        const card = document.getElementById('cors-rules-card');
        if (!card) return;
        const endpoint = card.dataset.corsUrl;
        const csrfToken = window.getCsrfToken ? window.getCsrfToken() : '';

        try {
            const resp = await fetch(endpoint, {
                method: 'DELETE',
                headers: { 'Content-Type': 'application/json', 'X-CSRFToken': csrfToken },
                body: JSON.stringify({ rule_index: index })
            });
            const data = await resp.json();
            if (!resp.ok) throw new Error(data.error || 'Failed to delete');
            showMessage({ title: 'Rule deleted', body: 'CORS rule has been deleted.', variant: 'success' });
            loadCorsRules(card, endpoint);
        } catch (err) {
            showMessage({ title: 'Delete failed', body: err.message, variant: 'danger' });
        }
    }

    return {
        init: init,
        loadLifecycleRules: loadLifecycleRules,
        loadCorsRules: loadCorsRules,
        loadAcl: loadAcl,
        deleteLifecycleRule: deleteLifecycleRule,
        deleteCorsRule: deleteCorsRule
    };
})();
548  static/js/bucket-detail-upload.js (new file)
@@ -0,0 +1,548 @@
window.BucketDetailUpload = (function() {
    'use strict';

    const MULTIPART_THRESHOLD = 8 * 1024 * 1024;
    const CHUNK_SIZE = 8 * 1024 * 1024;

    let state = {
        isUploading: false,
        uploadProgress: { current: 0, total: 0, currentFile: '' }
    };

    let elements = {};
    let callbacks = {};

    function init(config) {
        elements = {
            uploadForm: config.uploadForm,
            uploadFileInput: config.uploadFileInput,
            uploadModal: config.uploadModal,
            uploadModalEl: config.uploadModalEl,
            uploadSubmitBtn: config.uploadSubmitBtn,
            uploadCancelBtn: config.uploadCancelBtn,
            uploadBtnText: config.uploadBtnText,
            uploadDropZone: config.uploadDropZone,
            uploadDropZoneLabel: config.uploadDropZoneLabel,
            uploadProgressStack: config.uploadProgressStack,
            uploadKeyPrefix: config.uploadKeyPrefix,
            singleFileOptions: config.singleFileOptions,
            bulkUploadProgress: config.bulkUploadProgress,
            bulkUploadStatus: config.bulkUploadStatus,
            bulkUploadCounter: config.bulkUploadCounter,
            bulkUploadProgressBar: config.bulkUploadProgressBar,
            bulkUploadCurrentFile: config.bulkUploadCurrentFile,
            bulkUploadResults: config.bulkUploadResults,
            bulkUploadSuccessAlert: config.bulkUploadSuccessAlert,
            bulkUploadErrorAlert: config.bulkUploadErrorAlert,
            bulkUploadSuccessCount: config.bulkUploadSuccessCount,
            bulkUploadErrorCount: config.bulkUploadErrorCount,
            bulkUploadErrorList: config.bulkUploadErrorList,
            floatingProgress: config.floatingProgress,
            floatingProgressBar: config.floatingProgressBar,
            floatingProgressStatus: config.floatingProgressStatus,
            floatingProgressTitle: config.floatingProgressTitle,
            floatingProgressExpand: config.floatingProgressExpand
        };

        callbacks = {
            showMessage: config.showMessage || function() {},
            formatBytes: config.formatBytes || function(b) { return b + ' bytes'; },
            escapeHtml: config.escapeHtml || function(s) { return s; },
            onUploadComplete: config.onUploadComplete || function() {},
            hasFolders: config.hasFolders || function() { return false; },
            getCurrentPrefix: config.getCurrentPrefix || function() { return ''; }
        };

        setupEventListeners();
        setupBeforeUnload();
    }

    function isUploading() {
        return state.isUploading;
    }

    function setupBeforeUnload() {
        window.addEventListener('beforeunload', (e) => {
            if (state.isUploading) {
                e.preventDefault();
                e.returnValue = 'Upload in progress. Are you sure you want to leave?';
                return e.returnValue;
            }
        });
    }

    function showFloatingProgress() {
        if (elements.floatingProgress) {
            elements.floatingProgress.classList.remove('d-none');
        }
    }

    function hideFloatingProgress() {
        if (elements.floatingProgress) {
            elements.floatingProgress.classList.add('d-none');
        }
    }

    function updateFloatingProgress(current, total, currentFile) {
        state.uploadProgress = { current, total, currentFile: currentFile || '' };
        if (elements.floatingProgressBar && total > 0) {
            const percent = Math.round((current / total) * 100);
            elements.floatingProgressBar.style.width = `${percent}%`;
        }
        if (elements.floatingProgressStatus) {
            if (currentFile) {
                elements.floatingProgressStatus.textContent = `${current}/${total} files - ${currentFile}`;
            } else {
                elements.floatingProgressStatus.textContent = `${current}/${total} files completed`;
            }
        }
        if (elements.floatingProgressTitle) {
            elements.floatingProgressTitle.textContent = `Uploading ${total} file${total !== 1 ? 's' : ''}...`;
        }
    }

    function refreshUploadDropLabel() {
        if (!elements.uploadDropZoneLabel || !elements.uploadFileInput) return;
        const files = elements.uploadFileInput.files;
        if (!files || files.length === 0) {
            elements.uploadDropZoneLabel.textContent = 'No file selected';
            if (elements.singleFileOptions) elements.singleFileOptions.classList.remove('d-none');
            return;
        }
        elements.uploadDropZoneLabel.textContent = files.length === 1 ? files[0].name : `${files.length} files selected`;
        if (elements.singleFileOptions) {
            elements.singleFileOptions.classList.toggle('d-none', files.length > 1);
        }
    }

    function updateUploadBtnText() {
        if (!elements.uploadBtnText || !elements.uploadFileInput) return;
        const files = elements.uploadFileInput.files;
        if (!files || files.length <= 1) {
            elements.uploadBtnText.textContent = 'Upload';
        } else {
            elements.uploadBtnText.textContent = `Upload ${files.length} files`;
        }
    }

    function resetUploadUI() {
        if (elements.bulkUploadProgress) elements.bulkUploadProgress.classList.add('d-none');
        if (elements.bulkUploadResults) elements.bulkUploadResults.classList.add('d-none');
        if (elements.bulkUploadSuccessAlert) elements.bulkUploadSuccessAlert.classList.remove('d-none');
        if (elements.bulkUploadErrorAlert) elements.bulkUploadErrorAlert.classList.add('d-none');
        if (elements.bulkUploadErrorList) elements.bulkUploadErrorList.innerHTML = '';
        if (elements.uploadSubmitBtn) elements.uploadSubmitBtn.disabled = false;
        if (elements.uploadFileInput) elements.uploadFileInput.disabled = false;
        if (elements.uploadProgressStack) elements.uploadProgressStack.innerHTML = '';
        if (elements.uploadDropZone) {
            elements.uploadDropZone.classList.remove('upload-locked');
            elements.uploadDropZone.style.pointerEvents = '';
        }
        state.isUploading = false;
        hideFloatingProgress();
    }

    function setUploadLockState(locked) {
        if (elements.uploadDropZone) {
            elements.uploadDropZone.classList.toggle('upload-locked', locked);
            elements.uploadDropZone.style.pointerEvents = locked ? 'none' : '';
        }
        if (elements.uploadFileInput) {
            elements.uploadFileInput.disabled = locked;
        }
    }

    function createProgressItem(file) {
        const item = document.createElement('div');
        item.className = 'upload-progress-item';
        item.dataset.state = 'uploading';
        item.innerHTML = `
            <div class="d-flex justify-content-between align-items-start">
                <div class="min-width-0 flex-grow-1">
                    <div class="file-name">${callbacks.escapeHtml(file.name)}</div>
                    <div class="file-size">${callbacks.formatBytes(file.size)}</div>
                </div>
                <div class="upload-status text-end ms-2">Preparing...</div>
            </div>
            <div class="progress-container">
                <div class="progress">
                    <div class="progress-bar bg-primary" role="progressbar" style="width: 0%"></div>
                </div>
                <div class="progress-text">
                    <span class="progress-loaded">0 B</span>
                    <span class="progress-percent">0%</span>
                </div>
            </div>
        `;
        return item;
    }

    function updateProgressItem(item, { loaded, total, status, progressState, error }) {
        if (progressState) item.dataset.state = progressState;
        const statusEl = item.querySelector('.upload-status');
        const progressBar = item.querySelector('.progress-bar');
        const progressLoaded = item.querySelector('.progress-loaded');
        const progressPercent = item.querySelector('.progress-percent');

        if (status) {
            statusEl.textContent = status;
            statusEl.className = 'upload-status text-end ms-2';
            if (progressState === 'success') statusEl.classList.add('success');
            if (progressState === 'error') statusEl.classList.add('error');
        }
        if (typeof loaded === 'number' && typeof total === 'number' && total > 0) {
            const percent = Math.round((loaded / total) * 100);
            progressBar.style.width = `${percent}%`;
            progressLoaded.textContent = `${callbacks.formatBytes(loaded)} / ${callbacks.formatBytes(total)}`;
            progressPercent.textContent = `${percent}%`;
        }
        if (error) {
            const progressContainer = item.querySelector('.progress-container');
            if (progressContainer) {
                progressContainer.innerHTML = `<div class="text-danger small mt-1">${callbacks.escapeHtml(error)}</div>`;
            }
        }
    }

    async function uploadMultipart(file, objectKey, metadata, progressItem, urls) {
        const csrfToken = document.querySelector('input[name="csrf_token"]')?.value;

        updateProgressItem(progressItem, { status: 'Initiating...', loaded: 0, total: file.size });
        const initResp = await fetch(urls.initUrl, {
            method: 'POST',
            headers: { 'Content-Type': 'application/json', 'X-CSRFToken': csrfToken || '' },
            body: JSON.stringify({ object_key: objectKey, metadata })
        });
        if (!initResp.ok) {
            const err = await initResp.json().catch(() => ({}));
            throw new Error(err.error || 'Failed to initiate upload');
        }
        const { upload_id } = await initResp.json();

        const partUrl = urls.partTemplate.replace('UPLOAD_ID_PLACEHOLDER', upload_id);
        const completeUrl = urls.completeTemplate.replace('UPLOAD_ID_PLACEHOLDER', upload_id);
        const abortUrl = urls.abortTemplate.replace('UPLOAD_ID_PLACEHOLDER', upload_id);

        const parts = [];
        const totalParts = Math.ceil(file.size / CHUNK_SIZE);
        let uploadedBytes = 0;

        try {
            for (let partNumber = 1; partNumber <= totalParts; partNumber++) {
                const start = (partNumber - 1) * CHUNK_SIZE;
                const end = Math.min(start + CHUNK_SIZE, file.size);
                const chunk = file.slice(start, end);

                updateProgressItem(progressItem, {
                    status: `Part ${partNumber}/${totalParts}`,
                    loaded: uploadedBytes,
                    total: file.size
                });

                const partResp = await fetch(`${partUrl}?partNumber=${partNumber}`, {
                    method: 'PUT',
                    headers: { 'X-CSRFToken': csrfToken || '' },
                    body: chunk
                });

                if (!partResp.ok) {
                    const err = await partResp.json().catch(() => ({}));
                    throw new Error(err.error || `Part ${partNumber} failed`);
                }

                const partData = await partResp.json();
                parts.push({ part_number: partNumber, etag: partData.etag });
                uploadedBytes += chunk.size;

                updateProgressItem(progressItem, {
                    loaded: uploadedBytes,
                    total: file.size
                });
            }

            updateProgressItem(progressItem, { status: 'Completing...', loaded: file.size, total: file.size });
            const completeResp = await fetch(completeUrl, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json', 'X-CSRFToken': csrfToken || '' },
                body: JSON.stringify({ parts })
            });

            if (!completeResp.ok) {
                const err = await completeResp.json().catch(() => ({}));
                throw new Error(err.error || 'Failed to complete upload');
            }

            return await completeResp.json();
        } catch (err) {
            try {
                await fetch(abortUrl, { method: 'DELETE', headers: { 'X-CSRFToken': csrfToken || '' } });
            } catch {}
            throw err;
        }
    }

    async function uploadRegular(file, objectKey, metadata, progressItem, formAction) {
        return new Promise((resolve, reject) => {
            const formData = new FormData();
            formData.append('object', file);
            formData.append('object_key', objectKey);
            if (metadata) formData.append('metadata', JSON.stringify(metadata));
            const csrfToken = document.querySelector('input[name="csrf_token"]')?.value;
            if (csrfToken) formData.append('csrf_token', csrfToken);

            const xhr = new XMLHttpRequest();
            xhr.open('POST', formAction, true);
            xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');

            xhr.upload.addEventListener('progress', (e) => {
                if (e.lengthComputable) {
                    updateProgressItem(progressItem, {
                        status: 'Uploading...',
                        loaded: e.loaded,
                        total: e.total
                    });
                }
            });

            xhr.addEventListener('load', () => {
                if (xhr.status >= 200 && xhr.status < 300) {
                    try {
                        const data = JSON.parse(xhr.responseText);
                        if (data.status === 'error') {
                            reject(new Error(data.message || 'Upload failed'));
                        } else {
                            resolve(data);
                        }
                    } catch {
                        resolve({});
                    }
                } else {
                    try {
                        const data = JSON.parse(xhr.responseText);
                        reject(new Error(data.message || `Upload failed (${xhr.status})`));
                    } catch {
                        reject(new Error(`Upload failed (${xhr.status})`));
                    }
                }
            });

            xhr.addEventListener('error', () => reject(new Error('Network error')));
            xhr.addEventListener('abort', () => reject(new Error('Upload aborted')));

            xhr.send(formData);
        });
    }

    async function uploadSingleFile(file, keyPrefix, metadata, progressItem, urls) {
        const objectKey = keyPrefix ? `${keyPrefix}${file.name}` : file.name;
        const shouldUseMultipart = file.size >= MULTIPART_THRESHOLD && urls.initUrl;

        if (!progressItem && elements.uploadProgressStack) {
            progressItem = createProgressItem(file);
            elements.uploadProgressStack.appendChild(progressItem);
        }

        try {
            let result;
            if (shouldUseMultipart) {
                updateProgressItem(progressItem, { status: 'Multipart upload...', loaded: 0, total: file.size });
                result = await uploadMultipart(file, objectKey, metadata, progressItem, urls);
            } else {
                updateProgressItem(progressItem, { status: 'Uploading...', loaded: 0, total: file.size });
                result = await uploadRegular(file, objectKey, metadata, progressItem, urls.formAction);
            }
            updateProgressItem(progressItem, { progressState: 'success', status: 'Complete', loaded: file.size, total: file.size });
            return result;
        } catch (err) {
            updateProgressItem(progressItem, { progressState: 'error', status: 'Failed', error: err.message });
            throw err;
        }
    }

    async function performBulkUpload(files, urls) {
        if (state.isUploading || !files || files.length === 0) return;

        state.isUploading = true;
        setUploadLockState(true);
        const keyPrefix = (elements.uploadKeyPrefix?.value || '').trim();
|
||||||
|
const metadataRaw = elements.uploadForm?.querySelector('textarea[name="metadata"]')?.value?.trim();
|
||||||
|
    let metadata = null;
    if (metadataRaw) {
      try {
        metadata = JSON.parse(metadataRaw);
      } catch {
        callbacks.showMessage({ title: 'Invalid metadata', body: 'Metadata must be valid JSON.', variant: 'danger' });
        // Release the upload lock taken above; without this, a bad metadata
        // blob would leave isUploading stuck at true and block later uploads.
        state.isUploading = false;
        setUploadLockState(false);
        resetUploadUI();
        return;
      }
    }

    if (elements.bulkUploadProgress) elements.bulkUploadProgress.classList.remove('d-none');
    if (elements.bulkUploadResults) elements.bulkUploadResults.classList.add('d-none');
    if (elements.uploadSubmitBtn) elements.uploadSubmitBtn.disabled = true;
    if (elements.uploadFileInput) elements.uploadFileInput.disabled = true;

    const successFiles = [];
    const errorFiles = [];
    const total = files.length;

    updateFloatingProgress(0, total, files[0]?.name || '');

    for (let i = 0; i < total; i++) {
      const file = files[i];
      const current = i + 1;

      if (elements.bulkUploadCounter) elements.bulkUploadCounter.textContent = `${current}/${total}`;
      if (elements.bulkUploadCurrentFile) elements.bulkUploadCurrentFile.textContent = `Uploading: ${file.name}`;
      if (elements.bulkUploadProgressBar) {
        const percent = Math.round((current / total) * 100);
        elements.bulkUploadProgressBar.style.width = `${percent}%`;
      }
      updateFloatingProgress(i, total, file.name);

      try {
        await uploadSingleFile(file, keyPrefix, metadata, null, urls);
        successFiles.push(file.name);
      } catch (error) {
        errorFiles.push({ name: file.name, error: error.message || 'Unknown error' });
      }
    }
    updateFloatingProgress(total, total);

    if (elements.bulkUploadProgress) elements.bulkUploadProgress.classList.add('d-none');
    if (elements.bulkUploadResults) elements.bulkUploadResults.classList.remove('d-none');

    if (elements.bulkUploadSuccessCount) elements.bulkUploadSuccessCount.textContent = successFiles.length;
    if (successFiles.length === 0 && elements.bulkUploadSuccessAlert) {
      elements.bulkUploadSuccessAlert.classList.add('d-none');
    }

    if (errorFiles.length > 0) {
      if (elements.bulkUploadErrorCount) elements.bulkUploadErrorCount.textContent = errorFiles.length;
      if (elements.bulkUploadErrorAlert) elements.bulkUploadErrorAlert.classList.remove('d-none');
      if (elements.bulkUploadErrorList) {
        elements.bulkUploadErrorList.innerHTML = errorFiles
          .map(f => `<li><strong>${callbacks.escapeHtml(f.name)}</strong>: ${callbacks.escapeHtml(f.error)}</li>`)
          .join('');
      }
    }

    state.isUploading = false;
    setUploadLockState(false);

    if (successFiles.length > 0) {
      if (elements.uploadBtnText) elements.uploadBtnText.textContent = 'Refreshing...';
      callbacks.onUploadComplete(successFiles, errorFiles);
    } else {
      if (elements.uploadSubmitBtn) elements.uploadSubmitBtn.disabled = false;
      if (elements.uploadFileInput) elements.uploadFileInput.disabled = false;
    }
  }

  function setupEventListeners() {
    if (elements.uploadFileInput) {
      elements.uploadFileInput.addEventListener('change', () => {
        if (state.isUploading) return;
        refreshUploadDropLabel();
        updateUploadBtnText();
        resetUploadUI();
      });
    }

    if (elements.uploadDropZone) {
      elements.uploadDropZone.addEventListener('click', () => {
        if (state.isUploading) return;
        elements.uploadFileInput?.click();
      });
    }

    if (elements.floatingProgressExpand) {
      elements.floatingProgressExpand.addEventListener('click', () => {
        if (elements.uploadModal) {
          elements.uploadModal.show();
        }
      });
    }

    if (elements.uploadModalEl) {
      elements.uploadModalEl.addEventListener('hide.bs.modal', () => {
        if (state.isUploading) {
          showFloatingProgress();
        }
      });

      elements.uploadModalEl.addEventListener('hidden.bs.modal', () => {
        if (!state.isUploading) {
          resetUploadUI();
          if (elements.uploadFileInput) elements.uploadFileInput.value = '';
          refreshUploadDropLabel();
          updateUploadBtnText();
        }
      });

      elements.uploadModalEl.addEventListener('show.bs.modal', () => {
        if (state.isUploading) {
          hideFloatingProgress();
        }
        if (callbacks.hasFolders() && callbacks.getCurrentPrefix()) {
          if (elements.uploadKeyPrefix) {
            elements.uploadKeyPrefix.value = callbacks.getCurrentPrefix();
          }
        } else if (elements.uploadKeyPrefix) {
          elements.uploadKeyPrefix.value = '';
        }
      });
    }
  }

  function wireDropTarget(target, options) {
    const { highlightClass = '', autoOpenModal = false } = options || {};
    if (!target) return;

    const preventDefaults = (event) => {
      event.preventDefault();
      event.stopPropagation();
    };

    ['dragenter', 'dragover'].forEach((eventName) => {
      target.addEventListener(eventName, (event) => {
        preventDefaults(event);
        if (state.isUploading) return;
        if (highlightClass) {
          target.classList.add(highlightClass);
        }
      });
    });

    ['dragleave', 'drop'].forEach((eventName) => {
      target.addEventListener(eventName, (event) => {
        preventDefaults(event);
        if (highlightClass) {
          target.classList.remove(highlightClass);
        }
      });
    });

    target.addEventListener('drop', (event) => {
      if (state.isUploading) return;
      if (!event.dataTransfer?.files?.length || !elements.uploadFileInput) {
        return;
      }
      elements.uploadFileInput.files = event.dataTransfer.files;
      elements.uploadFileInput.dispatchEvent(new Event('change', { bubbles: true }));
      if (autoOpenModal && elements.uploadModal) {
        elements.uploadModal.show();
      }
    });
  }

  return {
    init: init,
    isUploading: isUploading,
    performBulkUpload: performBulkUpload,
    wireDropTarget: wireDropTarget,
    resetUploadUI: resetUploadUI,
    refreshUploadDropLabel: refreshUploadDropLabel,
    updateUploadBtnText: updateUploadBtnText
  };
})();
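
For orientation, a minimal wiring sketch for the upload module above. The global name (`window.BucketUpload` here) and the concrete endpoint paths are assumptions for illustration; only the `UPLOAD_ID_PLACEHOLDER` convention and the `urls` field names come from the code itself.

// Hypothetical wiring — module name and URL paths are illustrative assumptions.
var uploader = window.BucketUpload; // whatever the IIFE above is assigned to
var urls = {
  initUrl: '/buckets/demo/uploads',                                 // POST, returns { upload_id }
  partTemplate: '/buckets/demo/uploads/UPLOAD_ID_PLACEHOLDER/part', // PUT ?partNumber=N per chunk
  completeTemplate: '/buckets/demo/uploads/UPLOAD_ID_PLACEHOLDER',  // POST { parts: [...] }
  abortTemplate: '/buckets/demo/uploads/UPLOAD_ID_PLACEHOLDER',     // DELETE on failure
  formAction: '/buckets/demo/objects'                               // single-request path for small files
};
document.getElementById('uploadSubmitBtn').addEventListener('click', function() {
  var input = document.getElementById('uploadFileInput');
  if (input && input.files.length) {
    uploader.performBulkUpload(Array.from(input.files), urls);
  }
});

Files at or above MULTIPART_THRESHOLD go through the chunked init/part/complete flow; anything smaller falls back to a single XHR POST so progress events still drive the same UI.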
120
static/js/bucket-detail-utils.js
Normal file
@@ -0,0 +1,120 @@
window.BucketDetailUtils = (function() {
  'use strict';

  function setupJsonAutoIndent(textarea) {
    if (!textarea) return;

    textarea.addEventListener('keydown', function(e) {
      if (e.key === 'Enter') {
        e.preventDefault();

        const start = this.selectionStart;
        const end = this.selectionEnd;
        const value = this.value;

        const lineStart = value.lastIndexOf('\n', start - 1) + 1;
        const currentLine = value.substring(lineStart, start);

        const indentMatch = currentLine.match(/^(\s*)/);
        let indent = indentMatch ? indentMatch[1] : '';

        const trimmedLine = currentLine.trim();
        const lastChar = trimmedLine.slice(-1);

        let newIndent = indent;
        let insertAfter = '';

        if (lastChar === '{' || lastChar === '[') {
          newIndent = indent + '  ';

          const charAfterCursor = value.substring(start, start + 1).trim();
          if ((lastChar === '{' && charAfterCursor === '}') ||
              (lastChar === '[' && charAfterCursor === ']')) {
            insertAfter = '\n' + indent;
          }
        } else if (lastChar === ',' || lastChar === ':') {
          newIndent = indent;
        }

        const insertion = '\n' + newIndent + insertAfter;
        const newValue = value.substring(0, start) + insertion + value.substring(end);

        this.value = newValue;

        const newCursorPos = start + 1 + newIndent.length;
        this.selectionStart = this.selectionEnd = newCursorPos;

        this.dispatchEvent(new Event('input', { bubbles: true }));
      }

      if (e.key === 'Tab') {
        e.preventDefault();
        const start = this.selectionStart;
        const end = this.selectionEnd;

        if (e.shiftKey) {
          const lineStart = this.value.lastIndexOf('\n', start - 1) + 1;
          const lineContent = this.value.substring(lineStart, start);
          if (lineContent.startsWith('  ')) {
            this.value = this.value.substring(0, lineStart) +
              this.value.substring(lineStart + 2);
            this.selectionStart = this.selectionEnd = Math.max(lineStart, start - 2);
          }
        } else {
          this.value = this.value.substring(0, start) + '  ' + this.value.substring(end);
          this.selectionStart = this.selectionEnd = start + 2;
        }

        this.dispatchEvent(new Event('input', { bubbles: true }));
      }
    });
  }

  function formatBytes(bytes) {
    if (!Number.isFinite(bytes)) return `${bytes} bytes`;
    const units = ['bytes', 'KB', 'MB', 'GB', 'TB'];
    let i = 0;
    let size = bytes;
    while (size >= 1024 && i < units.length - 1) {
      size /= 1024;
      i++;
    }
    return `${size.toFixed(i === 0 ? 0 : 1)} ${units[i]}`;
  }

  function escapeHtml(value) {
    if (value === null || value === undefined) return '';
    return String(value)
      .replace(/&/g, '&amp;')
      .replace(/</g, '&lt;')
      .replace(/>/g, '&gt;')
      .replace(/"/g, '&quot;')
      .replace(/'/g, '&#39;');
  }

  function fallbackCopy(text) {
    const textArea = document.createElement('textarea');
    textArea.value = text;
    textArea.style.position = 'fixed';
    textArea.style.left = '-9999px';
    textArea.style.top = '-9999px';
    document.body.appendChild(textArea);
    textArea.focus();
    textArea.select();
    let success = false;
    try {
      success = document.execCommand('copy');
    } catch {
      success = false;
    }
    document.body.removeChild(textArea);
    return success;
  }

  return {
    setupJsonAutoIndent: setupJsonAutoIndent,
    formatBytes: formatBytes,
    escapeHtml: escapeHtml,
    fallbackCopy: fallbackCopy
  };
})();
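
A quick usage sketch for these helpers; the textarea id is an assumption, the rest follows directly from the functions above.

// Element id 'metadataEditor' is hypothetical — any JSON textarea works.
var utils = window.BucketDetailUtils;
utils.setupJsonAutoIndent(document.getElementById('metadataEditor')); // Enter/Tab give 2-space JSON indenting
console.log(utils.formatBytes(5368709120));                    // "5.0 GB"
console.log(utils.escapeHtml('<b>"hi" & \'bye\'</b>'));        // entity-escaped, safe for innerHTML
if (!navigator.clipboard) { utils.fallbackCopy('some text'); } // execCommand('copy') fallback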
344
static/js/connections-management.js
Normal file
@@ -0,0 +1,344 @@
window.ConnectionsManagement = (function() {
  'use strict';

  var endpoints = {};
  var csrfToken = '';

  function init(config) {
    endpoints = config.endpoints || {};
    csrfToken = config.csrfToken || '';

    setupEventListeners();
    checkAllConnectionHealth();
  }

  function togglePassword(id) {
    var input = document.getElementById(id);
    if (input) {
      input.type = input.type === 'password' ? 'text' : 'password';
    }
  }

  async function testConnection(formId, resultId) {
    var form = document.getElementById(formId);
    var resultDiv = document.getElementById(resultId);
    if (!form || !resultDiv) return;

    var formData = new FormData(form);
    var data = {};
    formData.forEach(function(value, key) {
      if (key !== 'csrf_token') {
        data[key] = value;
      }
    });

    resultDiv.innerHTML = '<div class="text-info"><span class="spinner-border spinner-border-sm" role="status" aria-hidden="true"></span> Testing connection...</div>';

    var controller = new AbortController();
    var timeoutId = setTimeout(function() { controller.abort(); }, 20000);

    try {
      var response = await fetch(endpoints.test, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'X-CSRFToken': csrfToken
        },
        body: JSON.stringify(data),
        signal: controller.signal
      });
      clearTimeout(timeoutId);

      var result = await response.json();
      if (response.ok) {
        resultDiv.innerHTML = '<div class="text-success">' +
          '<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="me-1" viewBox="0 0 16 16">' +
          '<path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zm-3.97-3.03a.75.75 0 0 0-1.08.022L7.477 9.417 5.384 7.323a.75.75 0 0 0-1.06 1.06L6.97 11.03a.75.75 0 0 0 1.079-.02l3.992-4.99a.75.75 0 0 0-.01-1.05z"/>' +
          '</svg>' + window.UICore.escapeHtml(result.message) + '</div>';
      } else {
        resultDiv.innerHTML = '<div class="text-danger">' +
          '<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="me-1" viewBox="0 0 16 16">' +
          '<path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zM5.354 4.646a.5.5 0 1 0-.708.708L7.293 8l-2.647 2.646a.5.5 0 0 0 .708.708L8 8.707l2.646 2.647a.5.5 0 0 0 .708-.708L8.707 8l2.647-2.646a.5.5 0 0 0-.708-.708L8 7.293 5.354 4.646z"/>' +
          '</svg>' + window.UICore.escapeHtml(result.message) + '</div>';
      }
    } catch (error) {
      clearTimeout(timeoutId);
      var message = error.name === 'AbortError'
        ? 'Connection test timed out - endpoint may be unreachable'
        : 'Connection failed: Network error';
      resultDiv.innerHTML = '<div class="text-danger">' +
        '<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="me-1" viewBox="0 0 16 16">' +
        '<path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zM5.354 4.646a.5.5 0 1 0-.708.708L7.293 8l-2.647 2.646a.5.5 0 0 0 .708.708L8 8.707l2.646 2.647a.5.5 0 0 0 .708-.708L8.707 8l2.647-2.646a.5.5 0 0 0-.708-.708L8 7.293 5.354 4.646z"/>' +
        '</svg>' + message + '</div>';
    }
  }

  async function checkConnectionHealth(connectionId, statusEl) {
    if (!statusEl) return;

    try {
      var controller = new AbortController();
      var timeoutId = setTimeout(function() { controller.abort(); }, 15000);

      var response = await fetch(endpoints.healthTemplate.replace('CONNECTION_ID', connectionId), {
        signal: controller.signal
      });
      clearTimeout(timeoutId);

      var data = await response.json();
      if (data.healthy) {
        statusEl.innerHTML = '<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="text-success" viewBox="0 0 16 16">' +
          '<path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zm-3.97-3.03a.75.75 0 0 0-1.08.022L7.477 9.417 5.384 7.323a.75.75 0 0 0-1.06 1.06L6.97 11.03a.75.75 0 0 0 1.079-.02l3.992-4.99a.75.75 0 0 0-.01-1.05z"/></svg>';
        statusEl.setAttribute('data-status', 'healthy');
        statusEl.setAttribute('title', 'Connected');
      } else {
        statusEl.innerHTML = '<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="text-danger" viewBox="0 0 16 16">' +
          '<path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zM5.354 4.646a.5.5 0 1 0-.708.708L7.293 8l-2.647 2.646a.5.5 0 0 0 .708.708L8 8.707l2.646 2.647a.5.5 0 0 0 .708-.708L8.707 8l2.647-2.646a.5.5 0 0 0-.708-.708L8 7.293 5.354 4.646z"/></svg>';
        statusEl.setAttribute('data-status', 'unhealthy');
        statusEl.setAttribute('title', data.error || 'Unreachable');
      }
    } catch (error) {
      statusEl.innerHTML = '<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="text-warning" viewBox="0 0 16 16">' +
        '<path d="M8.982 1.566a1.13 1.13 0 0 0-1.96 0L.165 13.233c-.457.778.091 1.767.98 1.767h13.713c.889 0 1.438-.99.98-1.767L8.982 1.566zM8 5c.535 0 .954.462.9.995l-.35 3.507a.552.552 0 0 1-1.1 0L7.1 5.995A.905.905 0 0 1 8 5zm.002 6a1 1 0 1 1 0 2 1 1 0 0 1 0-2z"/></svg>';
      statusEl.setAttribute('data-status', 'unknown');
      statusEl.setAttribute('title', 'Could not check status');
    }
  }

  function checkAllConnectionHealth() {
    var rows = document.querySelectorAll('tr[data-connection-id]');
    rows.forEach(function(row, index) {
      var connectionId = row.getAttribute('data-connection-id');
      var statusEl = row.querySelector('.connection-status');
      if (statusEl) {
        setTimeout(function() {
          checkConnectionHealth(connectionId, statusEl);
        }, index * 200);
      }
    });
  }

  function updateConnectionCount() {
    var countBadge = document.querySelector('.badge.bg-primary.bg-opacity-10.text-primary.fs-6');
    if (countBadge) {
      var remaining = document.querySelectorAll('tr[data-connection-id]').length;
      countBadge.textContent = remaining + ' connection' + (remaining !== 1 ? 's' : '');
    }
  }

  function createConnectionRowHtml(conn) {
    var ak = conn.access_key || '';
    var maskedKey = ak.length > 12 ? ak.slice(0, 8) + '...' + ak.slice(-4) : ak;

    return '<tr data-connection-id="' + window.UICore.escapeHtml(conn.id) + '">' +
      '<td class="text-center">' +
      '<span class="connection-status" data-status="checking" title="Checking...">' +
      '<span class="spinner-border spinner-border-sm text-muted" role="status" style="width: 12px; height: 12px;"></span>' +
      '</span></td>' +
      '<td><div class="d-flex align-items-center gap-2">' +
      '<div class="connection-icon"><svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" viewBox="0 0 16 16">' +
      '<path d="M4.406 3.342A5.53 5.53 0 0 1 8 2c2.69 0 4.923 2 5.166 4.579C14.758 6.804 16 8.137 16 9.773 16 11.569 14.502 13 12.687 13H3.781C1.708 13 0 11.366 0 9.318c0-1.763 1.266-3.223 2.942-3.593.143-.863.698-1.723 1.464-2.383z"/></svg></div>' +
      '<span class="fw-medium">' + window.UICore.escapeHtml(conn.name) + '</span>' +
      '</div></td>' +
      '<td><span class="text-muted small text-truncate d-inline-block" style="max-width: 200px;" title="' + window.UICore.escapeHtml(conn.endpoint_url) + '">' + window.UICore.escapeHtml(conn.endpoint_url) + '</span></td>' +
      '<td><span class="badge bg-primary bg-opacity-10 text-primary">' + window.UICore.escapeHtml(conn.region) + '</span></td>' +
      '<td><code class="small">' + window.UICore.escapeHtml(maskedKey) + '</code></td>' +
      '<td class="text-end"><div class="btn-group btn-group-sm" role="group">' +
      '<button type="button" class="btn btn-outline-secondary" data-bs-toggle="modal" data-bs-target="#editConnectionModal" ' +
      'data-id="' + window.UICore.escapeHtml(conn.id) + '" data-name="' + window.UICore.escapeHtml(conn.name) + '" ' +
      'data-endpoint="' + window.UICore.escapeHtml(conn.endpoint_url) + '" data-region="' + window.UICore.escapeHtml(conn.region) + '" ' +
      'data-access="' + window.UICore.escapeHtml(conn.access_key) + '" data-secret="' + window.UICore.escapeHtml(conn.secret_key || '') + '" title="Edit connection">' +
      '<svg xmlns="http://www.w3.org/2000/svg" width="14" height="14" fill="currentColor" viewBox="0 0 16 16">' +
      '<path d="M12.146.146a.5.5 0 0 1 .708 0l3 3a.5.5 0 0 1 0 .708l-10 10a.5.5 0 0 1-.168.11l-5 2a.5.5 0 0 1-.65-.65l2-5a.5.5 0 0 1 .11-.168l10-10zM11.207 2.5 13.5 4.793 14.793 3.5 12.5 1.207 11.207 2.5zm1.586 3L10.5 3.207 4 9.707V10h.5a.5.5 0 0 1 .5.5v.5h.5a.5.5 0 0 1 .5.5v.5h.293l6.5-6.5z"/></svg></button>' +
      '<button type="button" class="btn btn-outline-danger" data-bs-toggle="modal" data-bs-target="#deleteConnectionModal" ' +
      'data-id="' + window.UICore.escapeHtml(conn.id) + '" data-name="' + window.UICore.escapeHtml(conn.name) + '" title="Delete connection">' +
      '<svg xmlns="http://www.w3.org/2000/svg" width="14" height="14" fill="currentColor" viewBox="0 0 16 16">' +
      '<path d="M5.5 5.5A.5.5 0 0 1 6 6v6a.5.5 0 0 1-1 0V6a.5.5 0 0 1 .5-.5zm2.5 0a.5.5 0 0 1 .5.5v6a.5.5 0 0 1-1 0V6a.5.5 0 0 1 .5-.5zm3 .5a.5.5 0 0 0-1 0v6a.5.5 0 0 0 1 0V6z"/>' +
      '<path fill-rule="evenodd" d="M14.5 3a1 1 0 0 1-1 1H13v9a2 2 0 0 1-2 2H5a2 2 0 0 1-2-2V4h-.5a1 1 0 0 1-1-1V2a1 1 0 0 1 1-1H6a1 1 0 0 1 1-1h2a1 1 0 0 1 1 1h3.5a1 1 0 0 1 1 1v1zM4.118 4 4 4.059V13a1 1 0 0 0 1 1h6a1 1 0 0 0 1-1V4.059L11.882 4H4.118zM2.5 3V2h11v1h-11z"/></svg></button>' +
      '</div></td></tr>';
  }

  function setupEventListeners() {
    var testBtn = document.getElementById('testConnectionBtn');
    if (testBtn) {
      testBtn.addEventListener('click', function() {
        testConnection('createConnectionForm', 'testResult');
      });
    }

    var editTestBtn = document.getElementById('editTestConnectionBtn');
    if (editTestBtn) {
      editTestBtn.addEventListener('click', function() {
        testConnection('editConnectionForm', 'editTestResult');
      });
    }

    var editModal = document.getElementById('editConnectionModal');
    if (editModal) {
      editModal.addEventListener('show.bs.modal', function(event) {
        var button = event.relatedTarget;
        if (!button) return;

        var id = button.getAttribute('data-id');

        document.getElementById('edit_name').value = button.getAttribute('data-name') || '';
        document.getElementById('edit_endpoint_url').value = button.getAttribute('data-endpoint') || '';
        document.getElementById('edit_region').value = button.getAttribute('data-region') || '';
        document.getElementById('edit_access_key').value = button.getAttribute('data-access') || '';
        document.getElementById('edit_secret_key').value = button.getAttribute('data-secret') || '';
        document.getElementById('editTestResult').innerHTML = '';

        var form = document.getElementById('editConnectionForm');
        form.action = endpoints.updateTemplate.replace('CONNECTION_ID', id);
      });
    }

    var deleteModal = document.getElementById('deleteConnectionModal');
    if (deleteModal) {
      deleteModal.addEventListener('show.bs.modal', function(event) {
        var button = event.relatedTarget;
        if (!button) return;

        var id = button.getAttribute('data-id');
        var name = button.getAttribute('data-name');

        document.getElementById('deleteConnectionName').textContent = name;
        var form = document.getElementById('deleteConnectionForm');
        form.action = endpoints.deleteTemplate.replace('CONNECTION_ID', id);
      });
    }

    var createForm = document.getElementById('createConnectionForm');
    if (createForm) {
      createForm.addEventListener('submit', function(e) {
        e.preventDefault();
        window.UICore.submitFormAjax(createForm, {
          successMessage: 'Connection created',
          onSuccess: function(data) {
            createForm.reset();
            document.getElementById('testResult').innerHTML = '';

            if (data.connection) {
              var emptyState = document.querySelector('.empty-state');
              if (emptyState) {
                var cardBody = emptyState.closest('.card-body');
                if (cardBody) {
                  cardBody.innerHTML = '<div class="table-responsive"><table class="table table-hover align-middle mb-0">' +
                    '<thead class="table-light"><tr>' +
                    '<th scope="col" style="width: 50px;">Status</th>' +
                    '<th scope="col">Name</th><th scope="col">Endpoint</th>' +
                    '<th scope="col">Region</th><th scope="col">Access Key</th>' +
                    '<th scope="col" class="text-end">Actions</th></tr></thead>' +
                    '<tbody></tbody></table></div>';
                }
              }

              var tbody = document.querySelector('table tbody');
              if (tbody) {
                tbody.insertAdjacentHTML('beforeend', createConnectionRowHtml(data.connection));
                var newRow = tbody.lastElementChild;
                var statusEl = newRow.querySelector('.connection-status');
                if (statusEl) {
                  checkConnectionHealth(data.connection.id, statusEl);
                }
              }
              updateConnectionCount();
            } else {
              location.reload();
            }
          }
        });
      });
    }

    var editForm = document.getElementById('editConnectionForm');
    if (editForm) {
      editForm.addEventListener('submit', function(e) {
        e.preventDefault();
        window.UICore.submitFormAjax(editForm, {
          successMessage: 'Connection updated',
          onSuccess: function(data) {
            var modal = bootstrap.Modal.getInstance(document.getElementById('editConnectionModal'));
            if (modal) modal.hide();

            var connId = editForm.action.split('/').slice(-2)[0];
            var row = document.querySelector('tr[data-connection-id="' + connId + '"]');
            if (row && data.connection) {
              var nameCell = row.querySelector('.fw-medium');
              if (nameCell) nameCell.textContent = data.connection.name;

              var endpointCell = row.querySelector('.text-truncate');
              if (endpointCell) {
                endpointCell.textContent = data.connection.endpoint_url;
                endpointCell.title = data.connection.endpoint_url;
              }

              var regionBadge = row.querySelector('.badge.bg-primary');
              if (regionBadge) regionBadge.textContent = data.connection.region;

              var accessCode = row.querySelector('code.small');
              if (accessCode && data.connection.access_key) {
                var ak = data.connection.access_key;
                accessCode.textContent = ak.slice(0, 8) + '...' + ak.slice(-4);
              }

              var editBtn = row.querySelector('[data-bs-target="#editConnectionModal"]');
              if (editBtn) {
                editBtn.setAttribute('data-name', data.connection.name);
                editBtn.setAttribute('data-endpoint', data.connection.endpoint_url);
                editBtn.setAttribute('data-region', data.connection.region);
                editBtn.setAttribute('data-access', data.connection.access_key);
                if (data.connection.secret_key) {
                  editBtn.setAttribute('data-secret', data.connection.secret_key);
                }
              }

              var deleteBtn = row.querySelector('[data-bs-target="#deleteConnectionModal"]');
              if (deleteBtn) {
                deleteBtn.setAttribute('data-name', data.connection.name);
              }

              var statusEl = row.querySelector('.connection-status');
              if (statusEl) {
                checkConnectionHealth(connId, statusEl);
              }
            }
          }
        });
      });
    }

    var deleteForm = document.getElementById('deleteConnectionForm');
    if (deleteForm) {
      deleteForm.addEventListener('submit', function(e) {
        e.preventDefault();
        window.UICore.submitFormAjax(deleteForm, {
          successMessage: 'Connection deleted',
          onSuccess: function(data) {
            var modal = bootstrap.Modal.getInstance(document.getElementById('deleteConnectionModal'));
            if (modal) modal.hide();

            var connId = deleteForm.action.split('/').slice(-2)[0];
            var row = document.querySelector('tr[data-connection-id="' + connId + '"]');
            if (row) {
              row.remove();
            }

            updateConnectionCount();

            if (document.querySelectorAll('tr[data-connection-id]').length === 0) {
              location.reload();
            }
          }
        });
      });
    }
  }

  return {
    init: init,
    togglePassword: togglePassword,
    testConnection: testConnection,
    checkConnectionHealth: checkConnectionHealth
  };
})();
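
How a template might boot this module; the endpoint paths below are placeholders (only the field names and the CONNECTION_ID token are used by the code above).

// Endpoint paths are illustrative assumptions; field names match the module's usage.
window.ConnectionsManagement.init({
  csrfToken: document.querySelector('input[name="csrf_token"]') ? document.querySelector('input[name="csrf_token"]').value : '',
  endpoints: {
    test: '/connections/test',                           // POST JSON, returns { message }
    healthTemplate: '/connections/CONNECTION_ID/health', // GET, returns { healthy, error }
    updateTemplate: '/connections/CONNECTION_ID/edit',   // edit form action
    deleteTemplate: '/connections/CONNECTION_ID/delete'  // delete form action
  }
});

init() wires the modal and form handlers, then staggers one health probe per table row (200 ms apart) so a page full of connections does not fire every check at once.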
545
static/js/iam-management.js
Normal file
@@ -0,0 +1,545 @@
window.IAMManagement = (function() {
  'use strict';

  var users = [];
  var currentUserKey = null;
  var endpoints = {};
  var csrfToken = '';
  var iamLocked = false;

  var policyModal = null;
  var editUserModal = null;
  var deleteUserModal = null;
  var rotateSecretModal = null;
  var currentRotateKey = null;
  var currentEditKey = null;
  var currentDeleteKey = null;

  var policyTemplates = {
    full: [{ bucket: '*', actions: ['list', 'read', 'write', 'delete', 'share', 'policy', 'replication', 'iam:list_users', 'iam:*'] }],
    readonly: [{ bucket: '*', actions: ['list', 'read'] }],
    writer: [{ bucket: '*', actions: ['list', 'read', 'write'] }]
  };

  function init(config) {
    users = config.users || [];
    currentUserKey = config.currentUserKey || null;
    endpoints = config.endpoints || {};
    csrfToken = config.csrfToken || '';
    iamLocked = config.iamLocked || false;

    if (iamLocked) return;

    initModals();
    setupJsonAutoIndent();
    setupCopyButtons();
    setupPolicyEditor();
    setupCreateUserModal();
    setupEditUserModal();
    setupDeleteUserModal();
    setupRotateSecretModal();
    setupFormHandlers();
  }

  function initModals() {
    var policyModalEl = document.getElementById('policyEditorModal');
    var editModalEl = document.getElementById('editUserModal');
    var deleteModalEl = document.getElementById('deleteUserModal');
    var rotateModalEl = document.getElementById('rotateSecretModal');

    if (policyModalEl) policyModal = new bootstrap.Modal(policyModalEl);
    if (editModalEl) editUserModal = new bootstrap.Modal(editModalEl);
    if (deleteModalEl) deleteUserModal = new bootstrap.Modal(deleteModalEl);
    if (rotateModalEl) rotateSecretModal = new bootstrap.Modal(rotateModalEl);
  }

  function setupJsonAutoIndent() {
    window.UICore.setupJsonAutoIndent(document.getElementById('policyEditorDocument'));
    window.UICore.setupJsonAutoIndent(document.getElementById('createUserPolicies'));
  }

  function setupCopyButtons() {
    document.querySelectorAll('.config-copy').forEach(function(button) {
      button.addEventListener('click', async function() {
        var targetId = button.dataset.copyTarget;
        var target = document.getElementById(targetId);
        if (!target) return;
        await window.UICore.copyToClipboard(target.innerText, button, 'Copy JSON');
      });
    });

    var secretCopyButton = document.querySelector('[data-secret-copy]');
    if (secretCopyButton) {
      secretCopyButton.addEventListener('click', async function() {
        var secretInput = document.getElementById('disclosedSecretValue');
        if (!secretInput) return;
        await window.UICore.copyToClipboard(secretInput.value, secretCopyButton, 'Copy');
      });
    }
  }

  function getUserPolicies(accessKey) {
    var user = users.find(function(u) { return u.access_key === accessKey; });
    return user ? JSON.stringify(user.policies, null, 2) : '';
  }

  function applyPolicyTemplate(name, textareaEl) {
    if (policyTemplates[name] && textareaEl) {
      textareaEl.value = JSON.stringify(policyTemplates[name], null, 2);
    }
  }

  function setupPolicyEditor() {
    var userLabelEl = document.getElementById('policyEditorUserLabel');
    var userInputEl = document.getElementById('policyEditorUser');
    var textareaEl = document.getElementById('policyEditorDocument');

    document.querySelectorAll('[data-policy-template]').forEach(function(button) {
      button.addEventListener('click', function() {
        applyPolicyTemplate(button.dataset.policyTemplate, textareaEl);
      });
    });

    document.querySelectorAll('[data-policy-editor]').forEach(function(button) {
      button.addEventListener('click', function() {
        var key = button.getAttribute('data-access-key');
        if (!key) return;

        userLabelEl.textContent = key;
        userInputEl.value = key;
        textareaEl.value = getUserPolicies(key);

        policyModal.show();
      });
    });
  }

  function setupCreateUserModal() {
    var createUserPoliciesEl = document.getElementById('createUserPolicies');

    document.querySelectorAll('[data-create-policy-template]').forEach(function(button) {
      button.addEventListener('click', function() {
        applyPolicyTemplate(button.dataset.createPolicyTemplate, createUserPoliciesEl);
      });
    });
  }

  function setupEditUserModal() {
    var editUserForm = document.getElementById('editUserForm');
    var editUserDisplayName = document.getElementById('editUserDisplayName');

    document.querySelectorAll('[data-edit-user]').forEach(function(btn) {
      btn.addEventListener('click', function() {
        var key = btn.dataset.editUser;
        var name = btn.dataset.displayName;
        currentEditKey = key;
        editUserDisplayName.value = name;
        editUserForm.action = endpoints.updateUser.replace('ACCESS_KEY', key);
        editUserModal.show();
      });
    });
  }

  function setupDeleteUserModal() {
    var deleteUserForm = document.getElementById('deleteUserForm');
    var deleteUserLabel = document.getElementById('deleteUserLabel');
    var deleteSelfWarning = document.getElementById('deleteSelfWarning');

    document.querySelectorAll('[data-delete-user]').forEach(function(btn) {
      btn.addEventListener('click', function() {
        var key = btn.dataset.deleteUser;
        currentDeleteKey = key;
        deleteUserLabel.textContent = key;
        deleteUserForm.action = endpoints.deleteUser.replace('ACCESS_KEY', key);

        if (key === currentUserKey) {
          deleteSelfWarning.classList.remove('d-none');
        } else {
          deleteSelfWarning.classList.add('d-none');
        }

        deleteUserModal.show();
      });
    });
  }

  function setupRotateSecretModal() {
    var rotateUserLabel = document.getElementById('rotateUserLabel');
    var confirmRotateBtn = document.getElementById('confirmRotateBtn');
    var rotateCancelBtn = document.getElementById('rotateCancelBtn');
    var rotateDoneBtn = document.getElementById('rotateDoneBtn');
    var rotateSecretConfirm = document.getElementById('rotateSecretConfirm');
    var rotateSecretResult = document.getElementById('rotateSecretResult');
    var newSecretKeyInput = document.getElementById('newSecretKey');
    var copyNewSecretBtn = document.getElementById('copyNewSecret');

    document.querySelectorAll('[data-rotate-user]').forEach(function(btn) {
      btn.addEventListener('click', function() {
        currentRotateKey = btn.dataset.rotateUser;
        rotateUserLabel.textContent = currentRotateKey;

        rotateSecretConfirm.classList.remove('d-none');
        rotateSecretResult.classList.add('d-none');
        confirmRotateBtn.classList.remove('d-none');
        rotateCancelBtn.classList.remove('d-none');
        rotateDoneBtn.classList.add('d-none');

        rotateSecretModal.show();
      });
    });

    if (confirmRotateBtn) {
      confirmRotateBtn.addEventListener('click', async function() {
        if (!currentRotateKey) return;

        window.UICore.setButtonLoading(confirmRotateBtn, true, 'Rotating...');

        try {
          var url = endpoints.rotateSecret.replace('ACCESS_KEY', currentRotateKey);
          var response = await fetch(url, {
            method: 'POST',
            headers: {
              'Accept': 'application/json',
              'X-CSRFToken': csrfToken
            }
          });

          if (!response.ok) {
            var data = await response.json();
            throw new Error(data.error || 'Failed to rotate secret');
          }

          var data = await response.json();
          newSecretKeyInput.value = data.secret_key;

          rotateSecretConfirm.classList.add('d-none');
          rotateSecretResult.classList.remove('d-none');
          confirmRotateBtn.classList.add('d-none');
          rotateCancelBtn.classList.add('d-none');
          rotateDoneBtn.classList.remove('d-none');

        } catch (err) {
          if (window.showToast) {
            window.showToast(err.message, 'Error', 'danger');
          }
          rotateSecretModal.hide();
        } finally {
          window.UICore.setButtonLoading(confirmRotateBtn, false);
        }
      });
    }

    if (copyNewSecretBtn) {
      copyNewSecretBtn.addEventListener('click', async function() {
        await window.UICore.copyToClipboard(newSecretKeyInput.value, copyNewSecretBtn, 'Copy');
      });
    }

    if (rotateDoneBtn) {
      rotateDoneBtn.addEventListener('click', function() {
        window.location.reload();
      });
    }
  }

  function createUserCardHtml(accessKey, displayName, policies) {
    var policyBadges = '';
    if (policies && policies.length > 0) {
      policyBadges = policies.map(function(p) {
        var actionText = p.actions && p.actions.includes('*') ? 'full' : (p.actions ? p.actions.length : 0);
        return '<span class="badge bg-primary bg-opacity-10 text-primary">' +
          '<svg xmlns="http://www.w3.org/2000/svg" width="10" height="10" fill="currentColor" class="me-1" viewBox="0 0 16 16">' +
          '<path d="M2.522 5H2a.5.5 0 0 0-.494.574l1.372 9.149A1.5 1.5 0 0 0 4.36 16h7.278a1.5 1.5 0 0 0 1.483-1.277l1.373-9.149A.5.5 0 0 0 14 5h-.522A5.5 5.5 0 0 0 2.522 5zm1.005 0a4.5 4.5 0 0 1 8.945 0H3.527z"/>' +
          '</svg>' + window.UICore.escapeHtml(p.bucket) +
          '<span class="opacity-75">(' + actionText + ')</span></span>';
      }).join('');
    } else {
      policyBadges = '<span class="badge bg-secondary bg-opacity-10 text-secondary">No policies</span>';
    }

    return '<div class="col-md-6 col-xl-4">' +
      '<div class="card h-100 iam-user-card">' +
      '<div class="card-body">' +
      '<div class="d-flex align-items-start justify-content-between mb-3">' +
      '<div class="d-flex align-items-center gap-3 min-width-0 overflow-hidden">' +
      '<div class="user-avatar user-avatar-lg flex-shrink-0">' +
      '<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" fill="currentColor" viewBox="0 0 16 16">' +
      '<path d="M8 8a3 3 0 1 0 0-6 3 3 0 0 0 0 6zm2-3a2 2 0 1 1-4 0 2 2 0 0 1 4 0zm4 8c0 1-1 1-1 1H3s-1 0-1-1 1-4 6-4 6 3 6 4zm-1-.004c-.001-.246-.154-.986-.832-1.664C11.516 10.68 10.289 10 8 10c-2.29 0-3.516.68-4.168 1.332-.678.678-.83 1.418-.832 1.664h10z"/>' +
      '</svg></div>' +
      '<div class="min-width-0">' +
      '<h6 class="fw-semibold mb-0 text-truncate" title="' + window.UICore.escapeHtml(displayName) + '">' + window.UICore.escapeHtml(displayName) + '</h6>' +
      '<code class="small text-muted d-block text-truncate" title="' + window.UICore.escapeHtml(accessKey) + '">' + window.UICore.escapeHtml(accessKey) + '</code>' +
      '</div></div>' +
      '<div class="dropdown flex-shrink-0">' +
      '<button class="btn btn-sm btn-icon" type="button" data-bs-toggle="dropdown" aria-expanded="false">' +
      '<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" viewBox="0 0 16 16">' +
      '<path d="M9.5 13a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0zm0-5a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0zm0-5a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0z"/>' +
      '</svg></button>' +
      '<ul class="dropdown-menu dropdown-menu-end">' +
      '<li><button class="dropdown-item" type="button" data-edit-user="' + window.UICore.escapeHtml(accessKey) + '" data-display-name="' + window.UICore.escapeHtml(displayName) + '">' +
      '<svg xmlns="http://www.w3.org/2000/svg" width="14" height="14" fill="currentColor" class="me-2" viewBox="0 0 16 16"><path d="M12.146.146a.5.5 0 0 1 .708 0l3 3a.5.5 0 0 1 0 .708l-10 10a.5.5 0 0 1-.168.11l-5 2a.5.5 0 0 1-.65-.65l2-5a.5.5 0 0 1 .11-.168l10-10zM11.207 2.5 13.5 4.793 14.793 3.5 12.5 1.207 11.207 2.5zm1.586 3L10.5 3.207 4 9.707V10h.5a.5.5 0 0 1 .5.5v.5h.5a.5.5 0 0 1 .5.5v.5h.293l6.5-6.5z"/></svg>Edit Name</button></li>' +
      '<li><button class="dropdown-item" type="button" data-rotate-user="' + window.UICore.escapeHtml(accessKey) + '">' +
      '<svg xmlns="http://www.w3.org/2000/svg" width="14" height="14" fill="currentColor" class="me-2" viewBox="0 0 16 16"><path d="M11.534 7h3.932a.25.25 0 0 1 .192.41l-1.966 2.36a.25.25 0 0 1-.384 0l-1.966-2.36a.25.25 0 0 1 .192-.41zm-11 2h3.932a.25.25 0 0 0 .192-.41L2.692 6.23a.25.25 0 0 0-.384 0L.342 8.59A.25.25 0 0 0 .534 9z"/><path fill-rule="evenodd" d="M8 3c-1.552 0-2.94.707-3.857 1.818a.5.5 0 1 1-.771-.636A6.002 6.002 0 0 1 13.917 7H12.9A5.002 5.002 0 0 0 8 3zM3.1 9a5.002 5.002 0 0 0 8.757 2.182.5.5 0 1 1 .771.636A6.002 6.002 0 0 1 2.083 9H3.1z"/></svg>Rotate Secret</button></li>' +
      '<li><hr class="dropdown-divider"></li>' +
      '<li><button class="dropdown-item text-danger" type="button" data-delete-user="' + window.UICore.escapeHtml(accessKey) + '">' +
      '<svg xmlns="http://www.w3.org/2000/svg" width="14" height="14" fill="currentColor" class="me-2" viewBox="0 0 16 16"><path d="M5.5 5.5a.5.5 0 0 1 .5.5v6a.5.5 0 0 1-1 0v-6a.5.5 0 0 1 .5-.5zm2.5 0a.5.5 0 0 1 .5.5v6a.5.5 0 0 1-1 0v-6a.5.5 0 0 1 .5-.5zm3 .5v6a.5.5 0 0 1-1 0v-6a.5.5 0 0 1 1 0z"/><path fill-rule="evenodd" d="M14.5 3a1 1 0 0 1-1 1H13v9a2 2 0 0 1-2 2H5a2 2 0 0 1-2-2V4h-.5a1 1 0 0 1-1-1V2a1 1 0 0 1 1-1H6a1 1 0 0 1 1-1h2a1 1 0 0 1 1 1h3.5a1 1 0 0 1 1 1v1zM4.118 4 4 4.059V13a1 1 0 0 0 1 1h6a1 1 0 0 0 1-1V4.059L11.882 4H4.118zM2.5 3V2h11v1h-11z"/></svg>Delete User</button></li>' +
      '</ul></div></div>' +
      '<div class="mb-3">' +
      '<div class="small text-muted mb-2">Bucket Permissions</div>' +
      '<div class="d-flex flex-wrap gap-1">' + policyBadges + '</div></div>' +
      '<button class="btn btn-outline-primary btn-sm w-100" type="button" data-policy-editor data-access-key="' + window.UICore.escapeHtml(accessKey) + '">' +
      '<svg xmlns="http://www.w3.org/2000/svg" width="14" height="14" fill="currentColor" class="me-1" viewBox="0 0 16 16"><path d="M8 4.754a3.246 3.246 0 1 0 0 6.492 3.246 3.246 0 0 0 0-6.492zM5.754 8a2.246 2.246 0 1 1 4.492 0 2.246 2.246 0 0 1-4.492 0z"/><path d="M9.796 1.343c-.527-1.79-3.065-1.79-3.592 0l-.094.319a.873.873 0 0 1-1.255.52l-.292-.16c-1.64-.892-3.433.902-2.54 2.541l.159.292a.873.873 0 0 1-.52 1.255l-.319.094c-1.79.527-1.79 3.065 0 3.592l.319.094a.873.873 0 0 1 .52 1.255l-.16.292c-.892 1.64.901 3.434 2.541 2.54l.292-.159a.873.873 0 0 1 1.255.52l.094.319c.527 1.79 3.065 1.79 3.592 0l.094-.319a.873.873 0 0 1 1.255-.52l.292.16c1.64.893 3.434-.902 2.54-2.541l-.159-.292a.873.873 0 0 1 .52-1.255l.319-.094c1.79-.527 1.79-3.065 0-3.592l-.319-.094a.873.873 0 0 1-.52-1.255l.16-.292c.893-1.64-.902-3.433-2.541-2.54l-.292.159a.873.873 0 0 1-1.255-.52l-.094-.319z"/></svg>Manage Policies</button>' +
      '</div></div></div>';
  }

  function attachUserCardHandlers(cardElement, accessKey, displayName) {
    var editBtn = cardElement.querySelector('[data-edit-user]');
    if (editBtn) {
      editBtn.addEventListener('click', function() {
        currentEditKey = accessKey;
        document.getElementById('editUserDisplayName').value = displayName;
        document.getElementById('editUserForm').action = endpoints.updateUser.replace('ACCESS_KEY', accessKey);
        editUserModal.show();
      });
    }

    var deleteBtn = cardElement.querySelector('[data-delete-user]');
    if (deleteBtn) {
      deleteBtn.addEventListener('click', function() {
        currentDeleteKey = accessKey;
        document.getElementById('deleteUserLabel').textContent = accessKey;
        document.getElementById('deleteUserForm').action = endpoints.deleteUser.replace('ACCESS_KEY', accessKey);
        var deleteSelfWarning = document.getElementById('deleteSelfWarning');
        if (accessKey === currentUserKey) {
          deleteSelfWarning.classList.remove('d-none');
        } else {
          deleteSelfWarning.classList.add('d-none');
        }
        deleteUserModal.show();
      });
    }

    var rotateBtn = cardElement.querySelector('[data-rotate-user]');
    if (rotateBtn) {
      rotateBtn.addEventListener('click', function() {
        currentRotateKey = accessKey;
        document.getElementById('rotateUserLabel').textContent = accessKey;
        document.getElementById('rotateSecretConfirm').classList.remove('d-none');
        document.getElementById('rotateSecretResult').classList.add('d-none');
        document.getElementById('confirmRotateBtn').classList.remove('d-none');
        document.getElementById('rotateCancelBtn').classList.remove('d-none');
        document.getElementById('rotateDoneBtn').classList.add('d-none');
        rotateSecretModal.show();
      });
    }

    var policyBtn = cardElement.querySelector('[data-policy-editor]');
    if (policyBtn) {
      policyBtn.addEventListener('click', function() {
        document.getElementById('policyEditorUserLabel').textContent = accessKey;
        document.getElementById('policyEditorUser').value = accessKey;
        document.getElementById('policyEditorDocument').value = getUserPolicies(accessKey);
        policyModal.show();
      });
    }
  }

  function updateUserCount() {
    var countEl = document.querySelector('.card-header .text-muted.small');
    if (countEl) {
      var count = document.querySelectorAll('.iam-user-card').length;
      countEl.textContent = count + ' user' + (count !== 1 ? 's' : '') + ' configured';
    }
  }

  function setupFormHandlers() {
    var createUserForm = document.querySelector('#createUserModal form');
    if (createUserForm) {
      createUserForm.addEventListener('submit', function(e) {
        e.preventDefault();
        window.UICore.submitFormAjax(createUserForm, {
          successMessage: 'User created',
          onSuccess: function(data) {
            var modal = bootstrap.Modal.getInstance(document.getElementById('createUserModal'));
            if (modal) modal.hide();
            createUserForm.reset();

            var existingAlert = document.querySelector('.alert.alert-info.border-0.shadow-sm');
            if (existingAlert) existingAlert.remove();

            if (data.secret_key) {
              var alertHtml = '<div class="alert alert-info border-0 shadow-sm mb-4" role="alert" id="newUserSecretAlert">' +
                '<div class="d-flex align-items-start gap-2 mb-2">' +
                '<svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" class="bi bi-key flex-shrink-0 mt-1" viewBox="0 0 16 16">' +
                '<path d="M0 8a4 4 0 0 1 7.465-2H14a.5.5 0 0 1 .354.146l1.5 1.5a.5.5 0 0 1 0 .708l-1.5 1.5a.5.5 0 0 1-.708 0L13 9.207l-.646.647a.5.5 0 0 1-.708 0L11 9.207l-.646.647a.5.5 0 0 1-.708 0L9 9.207l-.646.647A.5.5 0 0 1 8 10h-.535A4 4 0 0 1 0 8zm4-3a3 3 0 1 0 2.712 4.285A.5.5 0 0 1 7.163 9h.63l.853-.854a.5.5 0 0 1 .708 0l.646.647.646-.647a.5.5 0 0 1 .708 0l.646.647.646-.647a.5.5 0 0 1 .708 0l.646.647.793-.793-1-1h-6.63a.5.5 0 0 1-.451-.285A3 3 0 0 0 4 5z"/><path d="M4 8a1 1 0 1 1-2 0 1 1 0 0 1 2 0z"/>' +
                '</svg>' +
                '<div class="flex-grow-1">' +
                '<div class="fw-semibold">New user created: <code>' + window.UICore.escapeHtml(data.access_key) + '</code></div>' +
                '<p class="mb-2 small">This secret is only shown once. Copy it now and store it securely.</p>' +
                '</div>' +
                '<button type="button" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button>' +
                '</div>' +
                '<div class="input-group">' +
                '<span class="input-group-text"><strong>Secret key</strong></span>' +
                '<input class="form-control font-monospace" type="text" value="' + window.UICore.escapeHtml(data.secret_key) + '" readonly id="newUserSecret" />' +
                '<button class="btn btn-outline-primary" type="button" id="copyNewUserSecret">Copy</button>' +
                '</div></div>';
              var container = document.querySelector('.page-header');
              if (container) {
                container.insertAdjacentHTML('afterend', alertHtml);
                document.getElementById('copyNewUserSecret').addEventListener('click', async function() {
                  await window.UICore.copyToClipboard(data.secret_key, this, 'Copy');
                });
              }
            }

            var usersGrid = document.querySelector('.row.g-3');
            var emptyState = document.querySelector('.empty-state');
            if (emptyState) {
              var emptyCol = emptyState.closest('.col-12');
              if (emptyCol) emptyCol.remove();
              if (!usersGrid) {
                var cardBody = document.querySelector('.card-body.px-4.pb-4');
                if (cardBody) {
                  cardBody.innerHTML = '<div class="row g-3"></div>';
                  usersGrid = cardBody.querySelector('.row.g-3');
                }
              }
            }

            if (usersGrid) {
              var cardHtml = createUserCardHtml(data.access_key, data.display_name, data.policies);
              usersGrid.insertAdjacentHTML('beforeend', cardHtml);
              var newCard = usersGrid.lastElementChild;
              attachUserCardHandlers(newCard, data.access_key, data.display_name);
              users.push({
                access_key: data.access_key,
                display_name: data.display_name,
                policies: data.policies || []
              });
              updateUserCount();
            }
          }
        });
      });
    }

    var policyEditorForm = document.getElementById('policyEditorForm');
    if (policyEditorForm) {
      policyEditorForm.addEventListener('submit', function(e) {
        e.preventDefault();
        var userInputEl = document.getElementById('policyEditorUser');
        var key = userInputEl.value;
        if (!key) return;

        var template = policyEditorForm.dataset.actionTemplate;
        policyEditorForm.action = template.replace('ACCESS_KEY_PLACEHOLDER', key);

        window.UICore.submitFormAjax(policyEditorForm, {
          successMessage: 'Policies updated',
          onSuccess: function(data) {
            policyModal.hide();

            var userCard = document.querySelector('[data-access-key="' + key + '"]');
            if (userCard) {
              var badgeContainer = userCard.closest('.iam-user-card').querySelector('.d-flex.flex-wrap.gap-1');
              if (badgeContainer && data.policies) {
                var badges = data.policies.map(function(p) {
                  return '<span class="badge bg-primary bg-opacity-10 text-primary">' +
                    '<svg xmlns="http://www.w3.org/2000/svg" width="10" height="10" fill="currentColor" class="me-1" viewBox="0 0 16 16">' +
                    '<path d="M2.522 5H2a.5.5 0 0 0-.494.574l1.372 9.149A1.5 1.5 0 0 0 4.36 16h7.278a1.5 1.5 0 0 0 1.483-1.277l1.373-9.149A.5.5 0 0 0 14 5h-.522A5.5 5.5 0 0 0 2.522 5zm1.005 0a4.5 4.5 0 0 1 8.945 0H3.527z"/>' +
                    '</svg>' + window.UICore.escapeHtml(p.bucket) +
                    '<span class="opacity-75">(' + (p.actions.includes('*') ? 'full' : p.actions.length) + ')</span></span>';
                }).join('');
                badgeContainer.innerHTML = badges || '<span class="badge bg-secondary bg-opacity-10 text-secondary">No policies</span>';
              }
            }

            var userIndex = users.findIndex(function(u) { return u.access_key === key; });
            if (userIndex >= 0 && data.policies) {
              users[userIndex].policies = data.policies;
            }
          }
        });
      });
    }

    var editUserForm = document.getElementById('editUserForm');
    if (editUserForm) {
      editUserForm.addEventListener('submit', function(e) {
        e.preventDefault();
        var key = currentEditKey;
        window.UICore.submitFormAjax(editUserForm, {
          successMessage: 'User updated',
          onSuccess: function(data) {
            editUserModal.hide();

            var newName = data.display_name || document.getElementById('editUserDisplayName').value;
            var editBtn = document.querySelector('[data-edit-user="' + key + '"]');
            if (editBtn) {
              editBtn.setAttribute('data-display-name', newName);
              var card = editBtn.closest('.iam-user-card');
              if (card) {
                var nameEl = card.querySelector('h6');
                if (nameEl) {
                  nameEl.textContent = newName;
                  nameEl.title = newName;
                }
              }
            }

            var userIndex = users.findIndex(function(u) { return u.access_key === key; });
            if (userIndex >= 0) {
|
||||||
|
users[userIndex].display_name = newName;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (key === currentUserKey) {
|
||||||
|
document.querySelectorAll('.sidebar-user .user-name').forEach(function(el) {
|
||||||
|
var truncated = newName.length > 16 ? newName.substring(0, 16) + '...' : newName;
|
||||||
|
el.textContent = truncated;
|
||||||
|
el.title = newName;
|
||||||
|
});
|
||||||
|
document.querySelectorAll('.sidebar-user[data-username]').forEach(function(el) {
|
||||||
|
el.setAttribute('data-username', newName);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
});
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
var deleteUserForm = document.getElementById('deleteUserForm');
|
||||||
|
if (deleteUserForm) {
|
||||||
|
deleteUserForm.addEventListener('submit', function(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
var key = currentDeleteKey;
|
||||||
|
window.UICore.submitFormAjax(deleteUserForm, {
|
||||||
|
successMessage: 'User deleted',
|
||||||
|
onSuccess: function(data) {
|
||||||
|
deleteUserModal.hide();
|
||||||
|
|
||||||
|
if (key === currentUserKey) {
|
||||||
|
window.location.href = '/ui/';
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
var deleteBtn = document.querySelector('[data-delete-user="' + key + '"]');
|
||||||
|
if (deleteBtn) {
|
||||||
|
var cardCol = deleteBtn.closest('[class*="col-"]');
|
||||||
|
if (cardCol) {
|
||||||
|
cardCol.remove();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
users = users.filter(function(u) { return u.access_key !== key; });
|
||||||
|
updateUserCount();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
init: init
|
||||||
|
};
|
||||||
|
})();
|
||||||
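Everything above runs inside the module's init function, and the IIFE exposes only that entry point. A minimal boot sketch, assuming the IIFE's return value is assigned to a global such as window.IAMManagement near the top of the file (the assignment sits outside this excerpt, so that name is an assumption):

// Hypothetical wiring; the IAMManagement global name is assumed, not shown in this diff.
document.addEventListener('DOMContentLoaded', function() {
  window.IAMManagement.init();
});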
324
static/js/ui-core.js
Normal file
@@ -0,0 +1,324 @@
window.UICore = (function() {
  'use strict';

  function getCsrfToken() {
    const meta = document.querySelector('meta[name="csrf-token"]');
    return meta ? meta.getAttribute('content') : '';
  }

  function formatBytes(bytes) {
    if (!Number.isFinite(bytes)) return bytes + ' bytes';
    const units = ['bytes', 'KB', 'MB', 'GB', 'TB'];
    let i = 0;
    let size = bytes;
    while (size >= 1024 && i < units.length - 1) {
      size /= 1024;
      i++;
    }
    return size.toFixed(i === 0 ? 0 : 1) + ' ' + units[i];
  }

  function escapeHtml(value) {
    if (value === null || value === undefined) return '';
    return String(value)
      .replace(/&/g, '&amp;')
      .replace(/</g, '&lt;')
      .replace(/>/g, '&gt;')
      .replace(/"/g, '&quot;')
      .replace(/'/g, '&#39;');
  }

  async function submitFormAjax(form, options) {
    options = options || {};
    var onSuccess = options.onSuccess || function() {};
    var onError = options.onError || function() {};
    var successMessage = options.successMessage || 'Operation completed';

    var formData = new FormData(form);
    var csrfToken = getCsrfToken();
    var submitBtn = form.querySelector('[type="submit"]');
    var originalHtml = submitBtn ? submitBtn.innerHTML : '';

    try {
      if (submitBtn) {
        submitBtn.disabled = true;
        submitBtn.innerHTML = '<span class="spinner-border spinner-border-sm me-1"></span>Saving...';
      }

      var formAction = form.getAttribute('action') || form.action;
      var response = await fetch(formAction, {
        method: form.getAttribute('method') || 'POST',
        headers: {
          'X-CSRFToken': csrfToken,
          'Accept': 'application/json',
          'X-Requested-With': 'XMLHttpRequest'
        },
        body: formData,
        redirect: 'follow'
      });

      var contentType = response.headers.get('content-type') || '';
      if (!contentType.includes('application/json')) {
        throw new Error('Server returned an unexpected response. Please try again.');
      }

      var data = await response.json();

      if (!response.ok) {
        throw new Error(data.error || 'HTTP ' + response.status);
      }

      window.showToast(data.message || successMessage, 'Success', 'success');
      onSuccess(data);

    } catch (err) {
      window.showToast(err.message, 'Error', 'error');
      onError(err);
    } finally {
      if (submitBtn) {
        submitBtn.disabled = false;
        submitBtn.innerHTML = originalHtml;
      }
    }
  }

  function PollingManager() {
    this.intervals = {};
    this.callbacks = {};
    this.timers = {};
    this.defaults = {
      replication: 30000,
      lifecycle: 60000,
      connectionHealth: 60000,
      bucketStats: 120000
    };
    this._loadSettings();
  }

  PollingManager.prototype._loadSettings = function() {
    try {
      var stored = localStorage.getItem('myfsio-polling-intervals');
      if (stored) {
        var settings = JSON.parse(stored);
        for (var key in settings) {
          if (settings.hasOwnProperty(key)) {
            this.defaults[key] = settings[key];
          }
        }
      }
    } catch (e) {
      console.warn('Failed to load polling settings:', e);
    }
  };

  PollingManager.prototype.saveSettings = function(settings) {
    try {
      for (var key in settings) {
        if (settings.hasOwnProperty(key)) {
          this.defaults[key] = settings[key];
        }
      }
      localStorage.setItem('myfsio-polling-intervals', JSON.stringify(this.defaults));
    } catch (e) {
      console.warn('Failed to save polling settings:', e);
    }
  };

  PollingManager.prototype.start = function(key, callback, interval) {
    this.stop(key);
    var ms = interval !== undefined ? interval : (this.defaults[key] || 30000);
    if (ms <= 0) return;

    this.callbacks[key] = callback;
    this.intervals[key] = ms;

    callback();

    var self = this;
    this.timers[key] = setInterval(function() {
      if (!document.hidden) {
        callback();
      }
    }, ms);
  };

  PollingManager.prototype.stop = function(key) {
    if (this.timers[key]) {
      clearInterval(this.timers[key]);
      delete this.timers[key];
    }
  };

  PollingManager.prototype.stopAll = function() {
    for (var key in this.timers) {
      if (this.timers.hasOwnProperty(key)) {
        clearInterval(this.timers[key]);
      }
    }
    this.timers = {};
  };

  PollingManager.prototype.updateInterval = function(key, newInterval) {
    var callback = this.callbacks[key];
    this.defaults[key] = newInterval;
    this.saveSettings(this.defaults);
    if (callback) {
      this.start(key, callback, newInterval);
    }
  };

  PollingManager.prototype.getSettings = function() {
    var result = {};
    for (var key in this.defaults) {
      if (this.defaults.hasOwnProperty(key)) {
        result[key] = this.defaults[key];
      }
    }
    return result;
  };

  var pollingManager = new PollingManager();

  document.addEventListener('visibilitychange', function() {
    if (document.hidden) {
      pollingManager.stopAll();
    } else {
      for (var key in pollingManager.callbacks) {
        if (pollingManager.callbacks.hasOwnProperty(key)) {
          pollingManager.start(key, pollingManager.callbacks[key], pollingManager.intervals[key]);
        }
      }
    }
  });

  return {
    getCsrfToken: getCsrfToken,
    formatBytes: formatBytes,
    escapeHtml: escapeHtml,
    submitFormAjax: submitFormAjax,
    PollingManager: PollingManager,
    pollingManager: pollingManager
  };
})();

window.pollingManager = window.UICore.pollingManager;

window.UICore.copyToClipboard = async function(text, button, originalText) {
  try {
    await navigator.clipboard.writeText(text);
    if (button) {
      var prevText = button.textContent;
      button.textContent = 'Copied!';
      setTimeout(function() {
        button.textContent = originalText || prevText;
      }, 1500);
    }
    return true;
  } catch (err) {
    console.error('Copy failed:', err);
    return false;
  }
};

window.UICore.setButtonLoading = function(button, isLoading, loadingText) {
  if (!button) return;
  if (isLoading) {
    button._originalHtml = button.innerHTML;
    button._originalDisabled = button.disabled;
    button.disabled = true;
    button.innerHTML = '<span class="spinner-border spinner-border-sm me-1"></span>' + (loadingText || 'Loading...');
  } else {
    button.disabled = button._originalDisabled || false;
    button.innerHTML = button._originalHtml || button.innerHTML;
  }
};

window.UICore.updateBadgeCount = function(selector, count, singular, plural) {
  var badge = document.querySelector(selector);
  if (badge) {
    var label = count === 1 ? (singular || '') : (plural || 's');
    badge.textContent = count + ' ' + label;
  }
};

window.UICore.setupJsonAutoIndent = function(textarea) {
  if (!textarea) return;

  textarea.addEventListener('keydown', function(e) {
    if (e.key === 'Enter') {
      e.preventDefault();

      var start = this.selectionStart;
      var end = this.selectionEnd;
      var value = this.value;

      var lineStart = value.lastIndexOf('\n', start - 1) + 1;
      var currentLine = value.substring(lineStart, start);

      var indentMatch = currentLine.match(/^(\s*)/);
      var indent = indentMatch ? indentMatch[1] : '';

      var trimmedLine = currentLine.trim();
      var lastChar = trimmedLine.slice(-1);

      var newIndent = indent;
      var insertAfter = '';

      if (lastChar === '{' || lastChar === '[') {
        newIndent = indent + '  ';

        var charAfterCursor = value.substring(start, start + 1).trim();
        if ((lastChar === '{' && charAfterCursor === '}') ||
            (lastChar === '[' && charAfterCursor === ']')) {
          insertAfter = '\n' + indent;
        }
      } else if (lastChar === ',' || lastChar === ':') {
        newIndent = indent;
      }

      var insertion = '\n' + newIndent + insertAfter;
      var newValue = value.substring(0, start) + insertion + value.substring(end);

      this.value = newValue;

      var newCursorPos = start + 1 + newIndent.length;
      this.selectionStart = this.selectionEnd = newCursorPos;

      this.dispatchEvent(new Event('input', { bubbles: true }));
    }

    if (e.key === 'Tab') {
      e.preventDefault();
      var start = this.selectionStart;
      var end = this.selectionEnd;

      if (e.shiftKey) {
        var lineStart = this.value.lastIndexOf('\n', start - 1) + 1;
        var lineContent = this.value.substring(lineStart, start);
        if (lineContent.startsWith('  ')) {
          this.value = this.value.substring(0, lineStart) +
            this.value.substring(lineStart + 2);
          this.selectionStart = this.selectionEnd = Math.max(lineStart, start - 2);
        }
      } else {
        this.value = this.value.substring(0, start) + '  ' + this.value.substring(end);
        this.selectionStart = this.selectionEnd = start + 2;
      }

      this.dispatchEvent(new Event('input', { bubbles: true }));
    }
  });
};

document.addEventListener('DOMContentLoaded', function() {
  var flashMessage = sessionStorage.getItem('flashMessage');
  if (flashMessage) {
    sessionStorage.removeItem('flashMessage');
    try {
      var msg = JSON.parse(flashMessage);
      if (window.showToast) {
        window.showToast(msg.body || msg.title, msg.title, msg.variant || 'info');
      }
    } catch (e) {}
  }
});
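Taken together, ui-core.js is the shared layer the page scripts above build on: submitFormAjax expects form endpoints to answer with JSON ({ message } on success, { error } on failure) and centralizes the CSRF header, the submit-button spinner, and toast feedback, while pollingManager runs named poll loops that pause automatically while the tab is hidden. A minimal usage sketch; the form id, endpoint URL, and poll key below are invented for illustration, and window.showToast is assumed to be provided by the base template:

// Hypothetical usage; 'demoForm', '/ui/demo/stats', and 'demoStats' are illustrative names only.
var demoForm = document.getElementById('demoForm');
if (demoForm) {
  demoForm.addEventListener('submit', function(e) {
    e.preventDefault();
    window.UICore.submitFormAjax(demoForm, {
      successMessage: 'Saved',
      onSuccess: function(data) { console.log('server replied:', data); }
    });
  });
}

// Fires immediately, then every 15 s while the tab is visible.
window.UICore.pollingManager.start('demoStats', function() {
  fetch('/ui/demo/stats').then(function(r) { return r.json(); }).then(console.log);
}, 15000);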
@@ -24,105 +24,218 @@
       document.documentElement.dataset.bsTheme = 'light';
       document.documentElement.dataset.theme = 'light';
     }
+    try {
+      if (localStorage.getItem('myfsio-sidebar-collapsed') === 'true') {
+        document.documentElement.classList.add('sidebar-will-collapse');
+      }
+    } catch (err) {}
   })();
 </script>
 <link rel="stylesheet" href="{{ url_for('static', filename='css/main.css') }}" />
 </head>
 <body>
-  <nav class="navbar navbar-expand-lg myfsio-nav shadow-sm">
-    <div class="container-fluid">
-      <a class="navbar-brand fw-semibold" href="{{ url_for('ui.buckets_overview') }}">
-        <img
-          src="{{ url_for('static', filename='images/MyFSIO.png') }}"
-          alt="MyFSIO logo"
-          class="myfsio-logo"
-          width="32"
-          height="32"
-          decoding="async"
-        />
-        <span class="myfsio-title">MyFSIO</span>
+  <header class="mobile-header d-lg-none">
+    <button class="sidebar-toggle-btn" type="button" data-bs-toggle="offcanvas" data-bs-target="#mobileSidebar" aria-controls="mobileSidebar" aria-label="Toggle navigation">
+      <svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" fill="currentColor" viewBox="0 0 16 16">
+        <path fill-rule="evenodd" d="M2.5 12a.5.5 0 0 1 .5-.5h10a.5.5 0 0 1 0 1H3a.5.5 0 0 1-.5-.5zm0-4a.5.5 0 0 1 .5-.5h10a.5.5 0 0 1 0 1H3a.5.5 0 0 1-.5-.5zm0-4a.5.5 0 0 1 .5-.5h10a.5.5 0 0 1 0 1H3a.5.5 0 0 1-.5-.5z"/>
+      </svg>
+    </button>
+    <a class="mobile-brand" href="{{ url_for('ui.buckets_overview') }}">
+      <img src="{{ url_for('static', filename='images/MyFSIO.png') }}" alt="MyFSIO logo" width="28" height="28" />
+      <span>MyFSIO</span>
+    </a>
+    <button class="theme-toggle-mobile" type="button" id="themeToggleMobile" aria-label="Toggle dark mode">
+      <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" class="theme-icon-mobile" id="themeToggleSunMobile" viewBox="0 0 16 16">
+        <path d="M8 11.5a3.5 3.5 0 1 1 0-7 3.5 3.5 0 0 1 0 7zm0 1.5a5 5 0 1 0 0-10 5 5 0 0 0 0 10zM8 0a.5.5 0 0 1 .5.5v1.555a.5.5 0 0 1-1 0V.5A.5.5 0 0 1 8 0zm0 12.945a.5.5 0 0 1 .5.5v2.055a.5.5 0 0 1-1 0v-2.055a.5.5 0 0 1 .5-.5zM2.343 2.343a.5.5 0 0 1 .707 0l1.1 1.1a.5.5 0 1 1-.708.707l-1.1-1.1a.5.5 0 0 1 0-.707zm9.507 9.507a.5.5 0 0 1 .707 0l1.1 1.1a.5.5 0 1 1-.707.708l-1.1-1.1a.5.5 0 0 1 0-.708zM0 8a.5.5 0 0 1 .5-.5h1.555a.5.5 0 0 1 0 1H.5A.5.5 0 0 1 0 8zm12.945 0a.5.5 0 0 1 .5-.5H15.5a.5.5 0 0 1 0 1h-2.055a.5.5 0 0 1-.5-.5zM2.343 13.657a.5.5 0 0 1 0-.707l1.1-1.1a.5.5 0 1 1 .708.707l-1.1 1.1a.5.5 0 0 1-.708 0zm9.507-9.507a.5.5 0 0 1 0-.707l1.1-1.1a.5.5 0 0 1 .707.708l-1.1 1.1a.5.5 0 0 1-.707 0z"/>
+      </svg>
+      <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" class="theme-icon-mobile" id="themeToggleMoonMobile" viewBox="0 0 16 16">
+        <path d="M6 .278a.768.768 0 0 1 .08.858 7.208 7.208 0 0 0-.878 3.46c0 4.021 3.278 7.277 7.318 7.277.527 0 1.04-.055 1.533-.16a.787.787 0 0 1 .81.316.733.733 0 0 1-.031.893A8.349 8.349 0 0 1 8.344 16C3.734 16 0 12.286 0 7.71 0 4.266 2.114 1.312 5.124.06A.752.752 0 0 1 6 .278z"/>
+        <path d="M10.794 3.148a.217.217 0 0 1 .412 0l.387 1.162c.173.518.579.924 1.097 1.097l1.162.387a.217.217 0 0 1 0 .412l-1.162.387a1.734 1.734 0 0 0-1.097 1.097l-.387 1.162a.217.217 0 0 1-.412 0l-.387-1.162A1.734 1.734 0 0 0 9.31 6.593l-1.162-.387a.217.217 0 0 1 0-.412l1.162-.387a1.734 1.734 0 0 0 1.097-1.097l.387-1.162zM13.863.099a.145.145 0 0 1 .274 0l.258.774c.115.346.386.617.732.732l.774.258a.145.145 0 0 1 0 .274l-.774.258a1.156 1.156 0 0 0-.732.732l-.258.774a.145.145 0 0 1-.274 0l-.258-.774a1.156 1.156 0 0 0-.732-.732l-.774-.258a.145.145 0 0 1 0-.274l.774-.258c.346-.115.617-.386.732-.732L13.863.1z"/>
+      </svg>
+    </button>
+  </header>
+
+  <div class="offcanvas offcanvas-start sidebar-offcanvas" tabindex="-1" id="mobileSidebar" aria-labelledby="mobileSidebarLabel">
+    <div class="offcanvas-header sidebar-header">
+      <a class="sidebar-brand" href="{{ url_for('ui.buckets_overview') }}">
+        <img src="{{ url_for('static', filename='images/MyFSIO.png') }}" alt="MyFSIO logo" class="sidebar-logo" width="36" height="36" />
+        <span class="sidebar-title">MyFSIO</span>
       </a>
-      <button class="navbar-toggler" type="button" data-bs-toggle="collapse" data-bs-target="#navContent" aria-controls="navContent" aria-expanded="false" aria-label="Toggle navigation">
-        <span class="navbar-toggler-icon"></span>
-      </button>
-      <div class="collapse navbar-collapse" id="navContent">
-        <ul class="navbar-nav me-auto mb-2 mb-lg-0">
-          {% if principal %}
-          <li class="nav-item">
-            <a class="nav-link" href="{{ url_for('ui.buckets_overview') }}">Buckets</a>
-          </li>
+      <button type="button" class="btn-close btn-close-white" data-bs-dismiss="offcanvas" aria-label="Close"></button>
+    </div>
+    <div class="offcanvas-body sidebar-body">
+      <nav class="sidebar-nav">
+        {% if principal %}
+        <div class="nav-section">
+          <span class="nav-section-title">Navigation</span>
+          <a href="{{ url_for('ui.buckets_overview') }}" class="sidebar-link {% if request.endpoint == 'ui.buckets_overview' or request.endpoint == 'ui.bucket_detail' %}active{% endif %}">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M2.522 5H2a.5.5 0 0 0-.494.574l1.372 9.149A1.5 1.5 0 0 0 4.36 16h7.278a1.5 1.5 0 0 0 1.483-1.277l1.373-9.149A.5.5 0 0 0 14 5h-.522A5.5 5.5 0 0 0 2.522 5zm1.005 0a4.5 4.5 0 0 1 8.945 0H3.527z"/>
+            </svg>
+            <span>Buckets</span>
+          </a>
           {% if can_manage_iam %}
-          <li class="nav-item">
-            <a class="nav-link" href="{{ url_for('ui.iam_dashboard') }}">IAM</a>
-          </li>
-          <li class="nav-item">
-            <a class="nav-link" href="{{ url_for('ui.connections_dashboard') }}">Connections</a>
-          </li>
-          <li class="nav-item">
-            <a class="nav-link" href="{{ url_for('ui.metrics_dashboard') }}">Metrics</a>
-          </li>
-          {% endif %}
-          {% endif %}
-          {% if principal %}
-          <li class="nav-item">
-            <a class="nav-link" href="{{ url_for('ui.docs_page') }}">Docs</a>
-          </li>
-          {% endif %}
-        </ul>
-        <div class="ms-lg-auto d-flex align-items-center gap-3 text-light flex-wrap">
-          <button
-            class="btn btn-outline-light btn-sm theme-toggle"
-            type="button"
-            id="themeToggle"
-            aria-pressed="false"
-            aria-label="Toggle dark mode"
-          >
-            <span id="themeToggleLabel" class="visually-hidden">Toggle dark mode</span>
-            <svg
-              xmlns="http://www.w3.org/2000/svg"
-              width="16"
-              height="16"
-              fill="currentColor"
-              class="theme-icon"
-              id="themeToggleSun"
-              viewBox="0 0 16 16"
-              aria-hidden="true"
-            >
-              <path
-                d="M8 11.5a3.5 3.5 0 1 1 0-7 3.5 3.5 0 0 1 0 7zm0 1.5a5 5 0 1 0 0-10 5 5 0 0 0 0 10zM8 0a.5.5 0 0 1 .5.5v1.555a.5.5 0 0 1-1 0V.5A.5.5 0 0 1 8 0zm0 12.945a.5.5 0 0 1 .5.5v2.055a.5.5 0 0 1-1 0v-2.055a.5.5 0 0 1 .5-.5zM2.343 2.343a.5.5 0 0 1 .707 0l1.1 1.1a.5.5 0 1 1-.708.707l-1.1-1.1a.5.5 0 0 1 0-.707zm9.507 9.507a.5.5 0 0 1 .707 0l1.1 1.1a.5.5 0 1 1-.707.708l-1.1-1.1a.5.5 0 0 1 0-.708zM0 8a.5.5 0 0 1 .5-.5h1.555a.5.5 0 0 1 0 1H.5A.5.5 0 0 1 0 8zm12.945 0a.5.5 0 0 1 .5-.5H15.5a.5.5 0 0 1 0 1h-2.055a.5.5 0 0 1-.5-.5zM2.343 13.657a.5.5 0 0 1 0-.707l1.1-1.1a.5.5 0 1 1 .708.707l-1.1 1.1a.5.5 0 0 1-.708 0zm9.507-9.507a.5.5 0 0 1 0-.707l1.1-1.1a.5.5 0 0 1 .707.708l-1.1 1.1a.5.5 0 0 1-.707 0z"
-              />
+          <a href="{{ url_for('ui.iam_dashboard') }}" class="sidebar-link {% if request.endpoint == 'ui.iam_dashboard' %}active{% endif %}">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M15 14s1 0 1-1-1-4-5-4-5 3-5 4 1 1 1 1h8zm-7.978-1A.261.261 0 0 1 7 12.996c.001-.264.167-1.03.76-1.72C8.312 10.629 9.282 10 11 10c1.717 0 2.687.63 3.24 1.276.593.69.758 1.457.76 1.72l-.008.002a.274.274 0 0 1-.014.002H7.022zM11 7a2 2 0 1 0 0-4 2 2 0 0 0 0 4zm3-2a3 3 0 1 1-6 0 3 3 0 0 1 6 0zM6.936 9.28a5.88 5.88 0 0 0-1.23-.247A7.35 7.35 0 0 0 5 9c-4 0-5 3-5 4 0 .667.333 1 1 1h4.216A2.238 2.238 0 0 1 5 13c0-1.01.377-2.042 1.09-2.904.243-.294.526-.569.846-.816zM4.92 10A5.493 5.493 0 0 0 4 13H1c0-.26.164-1.03.76-1.724.545-.636 1.492-1.256 3.16-1.275zM1.5 5.5a3 3 0 1 1 6 0 3 3 0 0 1-6 0zm3-2a2 2 0 1 0 0 4 2 2 0 0 0 0-4z"/>
             </svg>
-            <svg
-              xmlns="http://www.w3.org/2000/svg"
-              width="16"
-              height="16"
-              fill="currentColor"
-              class="theme-icon d-none"
-              id="themeToggleMoon"
-              viewBox="0 0 16 16"
-              aria-hidden="true"
-            >
-              <path d="M6 .278a.768.768 0 0 1 .08.858 7.208 7.208 0 0 0-.878 3.46c0 4.021 3.278 7.277 7.318 7.277.527 0 1.04-.055 1.533-.16a.787.787 0 0 1 .81.316.733.733 0 0 1-.031.893A8.349 8.349 0 0 1 8.344 16C3.734 16 0 12.286 0 7.71 0 4.266 2.114 1.312 5.124.06A.752.752 0 0 1 6 .278z"/>
-              <path d="M10.794 3.148a.217.217 0 0 1 .412 0l.387 1.162c.173.518.579.924 1.097 1.097l1.162.387a.217.217 0 0 1 0 .412l-1.162.387a1.734 1.734 0 0 0-1.097 1.097l-.387 1.162a.217.217 0 0 1-.412 0l-.387-1.162A1.734 1.734 0 0 0 9.31 6.593l-1.162-.387a.217.217 0 0 1 0-.412l1.162-.387a1.734 1.734 0 0 0 1.097-1.097l.387-1.162zM13.863.099a.145.145 0 0 1 .274 0l.258.774c.115.346.386.617.732.732l.774.258a.145.145 0 0 1 0 .274l-.774.258a1.156 1.156 0 0 0-.732.732l-.258.774a.145.145 0 0 1-.274 0l-.258-.774a1.156 1.156 0 0 0-.732-.732l-.774-.258a.145.145 0 0 1 0-.274l.774-.258c.346-.115.617-.386.732-.732L13.863.1z"/>
+            <span>IAM</span>
+          </a>
+          <a href="{{ url_for('ui.connections_dashboard') }}" class="sidebar-link {% if request.endpoint == 'ui.connections_dashboard' %}active{% endif %}">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path fill-rule="evenodd" d="M6 3.5A1.5 1.5 0 0 1 7.5 2h1A1.5 1.5 0 0 1 10 3.5v1A1.5 1.5 0 0 1 8.5 6v1H14a.5.5 0 0 1 .5.5v1a.5.5 0 0 1-1 0V8h-5v.5a.5.5 0 0 1-1 0V8h-5v.5a.5.5 0 0 1-1 0v-1A.5.5 0 0 1 2 7h5.5V6A1.5 1.5 0 0 1 6 4.5v-1zM8.5 5a.5.5 0 0 0 .5-.5v-1a.5.5 0 0 0-.5-.5h-1a.5.5 0 0 0-.5.5v1a.5.5 0 0 0 .5.5h1zM0 11.5A1.5 1.5 0 0 1 1.5 10h1A1.5 1.5 0 0 1 4 11.5v1A1.5 1.5 0 0 1 2.5 14h-1A1.5 1.5 0 0 1 0 12.5v-1zm1.5-.5a.5.5 0 0 0-.5.5v1a.5.5 0 0 0 .5.5h1a.5.5 0 0 0 .5-.5v-1a.5.5 0 0 0-.5-.5h-1zm4.5.5A1.5 1.5 0 0 1 7.5 10h1a1.5 1.5 0 0 1 1.5 1.5v1A1.5 1.5 0 0 1 8.5 14h-1A1.5 1.5 0 0 1 6 12.5v-1zm1.5-.5a.5.5 0 0 0-.5.5v1a.5.5 0 0 0 .5.5h1a.5.5 0 0 0 .5-.5v-1a.5.5 0 0 0-.5-.5h-1zm4.5.5a1.5 1.5 0 0 1 1.5-1.5h1a1.5 1.5 0 0 1 1.5 1.5v1a1.5 1.5 0 0 1-1.5 1.5h-1a1.5 1.5 0 0 1-1.5-1.5v-1zm1.5-.5a.5.5 0 0 0-.5.5v1a.5.5 0 0 0 .5.5h1a.5.5 0 0 0 .5-.5v-1a.5.5 0 0 0-.5-.5h-1z"/>
             </svg>
-          </button>
-          {% if principal %}
-          <div class="text-end small">
-            <div class="fw-semibold" title="{{ principal.display_name }}">{{ principal.display_name | truncate(20, true) }}</div>
-            <div class="opacity-75">{{ principal.access_key }}</div>
-          </div>
-          <form method="post" action="{{ url_for('ui.logout') }}">
-            <input type="hidden" name="csrf_token" value="{{ csrf_token() }}" />
-            <button class="btn btn-outline-light btn-sm" type="submit">Sign out</button>
-          </form>
+            <span>Connections</span>
+          </a>
+          <a href="{{ url_for('ui.metrics_dashboard') }}" class="sidebar-link {% if request.endpoint == 'ui.metrics_dashboard' %}active{% endif %}">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M8 4a.5.5 0 0 1 .5.5V6a.5.5 0 0 1-1 0V4.5A.5.5 0 0 1 8 4zM3.732 5.732a.5.5 0 0 1 .707 0l.915.914a.5.5 0 1 1-.708.708l-.914-.915a.5.5 0 0 1 0-.707zM2 10a.5.5 0 0 1 .5-.5h1.586a.5.5 0 0 1 0 1H2.5A.5.5 0 0 1 2 10zm9.5 0a.5.5 0 0 1 .5-.5h1.5a.5.5 0 0 1 0 1H12a.5.5 0 0 1-.5-.5zm.754-4.246a.389.389 0 0 0-.527-.02L7.547 9.31a.91.91 0 1 0 1.302 1.258l3.434-4.297a.389.389 0 0 0-.029-.518z"/>
+              <path fill-rule="evenodd" d="M0 10a8 8 0 1 1 15.547 2.661c-.442 1.253-1.845 1.602-2.932 1.25C11.309 13.488 9.475 13 8 13c-1.474 0-3.31.488-4.615.911-1.087.352-2.49.003-2.932-1.25A7.988 7.988 0 0 1 0 10zm8-7a7 7 0 0 0-6.603 9.329c.203.575.923.876 1.68.63C4.397 12.533 6.358 12 8 12s3.604.532 4.923.96c.757.245 1.477-.056 1.68-.631A7 7 0 0 0 8 3z"/>
+            </svg>
+            <span>Metrics</span>
+          </a>
           {% endif %}
         </div>
+        <div class="nav-section">
+          <span class="nav-section-title">Resources</span>
+          <a href="{{ url_for('ui.docs_page') }}" class="sidebar-link {% if request.endpoint == 'ui.docs_page' %}active{% endif %}">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M1 2.828c.885-.37 2.154-.769 3.388-.893 1.33-.134 2.458.063 3.112.752v9.746c-.935-.53-2.12-.603-3.213-.493-1.18.12-2.37.461-3.287.811V2.828zm7.5-.141c.654-.689 1.782-.886 3.112-.752 1.234.124 2.503.523 3.388.893v9.923c-.918-.35-2.107-.692-3.287-.81-1.094-.111-2.278-.039-3.213.492V2.687zM8 1.783C7.015.936 5.587.81 4.287.94c-1.514.153-3.042.672-3.994 1.105A.5.5 0 0 0 0 2.5v11a.5.5 0 0 0 .707.455c.882-.4 2.303-.881 3.68-1.02 1.409-.142 2.59.087 3.223.877a.5.5 0 0 0 .78 0c.633-.79 1.814-1.019 3.222-.877 1.378.139 2.8.62 3.681 1.02A.5.5 0 0 0 16 13.5v-11a.5.5 0 0 0-.293-.455c-.952-.433-2.48-.952-3.994-1.105C10.413.809 8.985.936 8 1.783z"/>
+            </svg>
+            <span>Documentation</span>
+          </a>
+        </div>
+        {% endif %}
+      </nav>
+      {% if principal %}
+      <div class="sidebar-footer">
+        <div class="sidebar-user">
+          <div class="user-avatar">
+            <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M11 6a3 3 0 1 1-6 0 3 3 0 0 1 6 0z"/>
+              <path fill-rule="evenodd" d="M0 8a8 8 0 1 1 16 0A8 8 0 0 1 0 8zm8-7a7 7 0 0 0-5.468 11.37C3.242 11.226 4.805 10 8 10s4.757 1.225 5.468 2.37A7 7 0 0 0 8 1z"/>
+            </svg>
+          </div>
+          <div class="user-info">
+            <div class="user-name" title="{{ principal.display_name }}">{{ principal.display_name | truncate(16, true) }}</div>
+            <div class="user-key">{{ principal.access_key | truncate(12, true) }}</div>
+          </div>
+        </div>
+        <form method="post" action="{{ url_for('ui.logout') }}" class="w-100">
+          <input type="hidden" name="csrf_token" value="{{ csrf_token() }}" />
+          <button class="sidebar-logout-btn" type="submit">
+            <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" viewBox="0 0 16 16">
+              <path fill-rule="evenodd" d="M10 12.5a.5.5 0 0 1-.5.5h-8a.5.5 0 0 1-.5-.5v-9a.5.5 0 0 1 .5-.5h8a.5.5 0 0 1 .5.5v2a.5.5 0 0 0 1 0v-2A1.5 1.5 0 0 0 9.5 2h-8A1.5 1.5 0 0 0 0 3.5v9A1.5 1.5 0 0 0 1.5 14h8a1.5 1.5 0 0 0 1.5-1.5v-2a.5.5 0 0 0-1 0v2z"/>
+              <path fill-rule="evenodd" d="M15.854 8.354a.5.5 0 0 0 0-.708l-3-3a.5.5 0 0 0-.708.708L14.293 7.5H5.5a.5.5 0 0 0 0 1h8.793l-2.147 2.146a.5.5 0 0 0 .708.708l3-3z"/>
+            </svg>
+            <span>Sign out</span>
+          </button>
+        </form>
       </div>
+      {% endif %}
     </div>
-  </nav>
-  <main class="container py-4">
-    {% block content %}{% endblock %}
-  </main>
+  </div>
+
+  <aside class="sidebar d-none d-lg-flex" id="desktopSidebar">
+    <div class="sidebar-header">
+      <div class="sidebar-brand" id="sidebarBrand">
+        <img src="{{ url_for('static', filename='images/MyFSIO.png') }}" alt="MyFSIO logo" class="sidebar-logo" width="36" height="36" />
+        <span class="sidebar-title">MyFSIO</span>
+      </div>
+      <button class="sidebar-collapse-btn" type="button" id="sidebarCollapseBtn" aria-label="Collapse sidebar">
+        <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" viewBox="0 0 16 16">
+          <path fill-rule="evenodd" d="M11.354 1.646a.5.5 0 0 1 0 .708L5.707 8l5.647 5.646a.5.5 0 0 1-.708.708l-6-6a.5.5 0 0 1 0-.708l6-6a.5.5 0 0 1 .708 0z"/>
+        </svg>
+      </button>
+    </div>
+    <div class="sidebar-body">
+      <nav class="sidebar-nav">
+        {% if principal %}
+        <div class="nav-section">
+          <span class="nav-section-title">Navigation</span>
+          <a href="{{ url_for('ui.buckets_overview') }}" class="sidebar-link {% if request.endpoint == 'ui.buckets_overview' or request.endpoint == 'ui.bucket_detail' %}active{% endif %}" data-tooltip="Buckets">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M2.522 5H2a.5.5 0 0 0-.494.574l1.372 9.149A1.5 1.5 0 0 0 4.36 16h7.278a1.5 1.5 0 0 0 1.483-1.277l1.373-9.149A.5.5 0 0 0 14 5h-.522A5.5 5.5 0 0 0 2.522 5zm1.005 0a4.5 4.5 0 0 1 8.945 0H3.527z"/>
+            </svg>
+            <span class="sidebar-link-text">Buckets</span>
+          </a>
+          {% if can_manage_iam %}
+          <a href="{{ url_for('ui.iam_dashboard') }}" class="sidebar-link {% if request.endpoint == 'ui.iam_dashboard' %}active{% endif %}" data-tooltip="IAM">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M15 14s1 0 1-1-1-4-5-4-5 3-5 4 1 1 1 1h8zm-7.978-1A.261.261 0 0 1 7 12.996c.001-.264.167-1.03.76-1.72C8.312 10.629 9.282 10 11 10c1.717 0 2.687.63 3.24 1.276.593.69.758 1.457.76 1.72l-.008.002a.274.274 0 0 1-.014.002H7.022zM11 7a2 2 0 1 0 0-4 2 2 0 0 0 0 4zm3-2a3 3 0 1 1-6 0 3 3 0 0 1 6 0zM6.936 9.28a5.88 5.88 0 0 0-1.23-.247A7.35 7.35 0 0 0 5 9c-4 0-5 3-5 4 0 .667.333 1 1 1h4.216A2.238 2.238 0 0 1 5 13c0-1.01.377-2.042 1.09-2.904.243-.294.526-.569.846-.816zM4.92 10A5.493 5.493 0 0 0 4 13H1c0-.26.164-1.03.76-1.724.545-.636 1.492-1.256 3.16-1.275zM1.5 5.5a3 3 0 1 1 6 0 3 3 0 0 1-6 0zm3-2a2 2 0 1 0 0 4 2 2 0 0 0 0-4z"/>
+            </svg>
+            <span class="sidebar-link-text">IAM</span>
+          </a>
+          <a href="{{ url_for('ui.connections_dashboard') }}" class="sidebar-link {% if request.endpoint == 'ui.connections_dashboard' %}active{% endif %}" data-tooltip="Connections">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path fill-rule="evenodd" d="M6 3.5A1.5 1.5 0 0 1 7.5 2h1A1.5 1.5 0 0 1 10 3.5v1A1.5 1.5 0 0 1 8.5 6v1H14a.5.5 0 0 1 .5.5v1a.5.5 0 0 1-1 0V8h-5v.5a.5.5 0 0 1-1 0V8h-5v.5a.5.5 0 0 1-1 0v-1A.5.5 0 0 1 2 7h5.5V6A1.5 1.5 0 0 1 6 4.5v-1zM8.5 5a.5.5 0 0 0 .5-.5v-1a.5.5 0 0 0-.5-.5h-1a.5.5 0 0 0-.5.5v1a.5.5 0 0 0 .5.5h1zM0 11.5A1.5 1.5 0 0 1 1.5 10h1A1.5 1.5 0 0 1 4 11.5v1A1.5 1.5 0 0 1 2.5 14h-1A1.5 1.5 0 0 1 0 12.5v-1zm1.5-.5a.5.5 0 0 0-.5.5v1a.5.5 0 0 0 .5.5h1a.5.5 0 0 0 .5-.5v-1a.5.5 0 0 0-.5-.5h-1zm4.5.5A1.5 1.5 0 0 1 7.5 10h1a1.5 1.5 0 0 1 1.5 1.5v1A1.5 1.5 0 0 1 8.5 14h-1A1.5 1.5 0 0 1 6 12.5v-1zm1.5-.5a.5.5 0 0 0-.5.5v1a.5.5 0 0 0 .5.5h1a.5.5 0 0 0 .5-.5v-1a.5.5 0 0 0-.5-.5h-1zm4.5.5a1.5 1.5 0 0 1 1.5-1.5h1a1.5 1.5 0 0 1 1.5 1.5v1a1.5 1.5 0 0 1-1.5 1.5h-1a1.5 1.5 0 0 1-1.5-1.5v-1zm1.5-.5a.5.5 0 0 0-.5.5v1a.5.5 0 0 0 .5.5h1a.5.5 0 0 0 .5-.5v-1a.5.5 0 0 0-.5-.5h-1z"/>
+            </svg>
+            <span class="sidebar-link-text">Connections</span>
+          </a>
+          <a href="{{ url_for('ui.metrics_dashboard') }}" class="sidebar-link {% if request.endpoint == 'ui.metrics_dashboard' %}active{% endif %}" data-tooltip="Metrics">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M8 4a.5.5 0 0 1 .5.5V6a.5.5 0 0 1-1 0V4.5A.5.5 0 0 1 8 4zM3.732 5.732a.5.5 0 0 1 .707 0l.915.914a.5.5 0 1 1-.708.708l-.914-.915a.5.5 0 0 1 0-.707zM2 10a.5.5 0 0 1 .5-.5h1.586a.5.5 0 0 1 0 1H2.5A.5.5 0 0 1 2 10zm9.5 0a.5.5 0 0 1 .5-.5h1.5a.5.5 0 0 1 0 1H12a.5.5 0 0 1-.5-.5zm.754-4.246a.389.389 0 0 0-.527-.02L7.547 9.31a.91.91 0 1 0 1.302 1.258l3.434-4.297a.389.389 0 0 0-.029-.518z"/>
+              <path fill-rule="evenodd" d="M0 10a8 8 0 1 1 15.547 2.661c-.442 1.253-1.845 1.602-2.932 1.25C11.309 13.488 9.475 13 8 13c-1.474 0-3.31.488-4.615.911-1.087.352-2.49.003-2.932-1.25A7.988 7.988 0 0 1 0 10zm8-7a7 7 0 0 0-6.603 9.329c.203.575.923.876 1.68.63C4.397 12.533 6.358 12 8 12s3.604.532 4.923.96c.757.245 1.477-.056 1.68-.631A7 7 0 0 0 8 3z"/>
+            </svg>
+            <span class="sidebar-link-text">Metrics</span>
+          </a>
+          {% endif %}
+        </div>
+        <div class="nav-section">
+          <span class="nav-section-title">Resources</span>
+          <a href="{{ url_for('ui.docs_page') }}" class="sidebar-link {% if request.endpoint == 'ui.docs_page' %}active{% endif %}" data-tooltip="Documentation">
+            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" viewBox="0 0 16 16">
+              <path d="M1 2.828c.885-.37 2.154-.769 3.388-.893 1.33-.134 2.458.063 3.112.752v9.746c-.935-.53-2.12-.603-3.213-.493-1.18.12-2.37.461-3.287.811V2.828zm7.5-.141c.654-.689 1.782-.886 3.112-.752 1.234.124 2.503.523 3.388.893v9.923c-.918-.35-2.107-.692-3.287-.81-1.094-.111-2.278-.039-3.213.492V2.687zM8 1.783C7.015.936 5.587.81 4.287.94c-1.514.153-3.042.672-3.994 1.105A.5.5 0 0 0 0 2.5v11a.5.5 0 0 0 .707.455c.882-.4 2.303-.881 3.68-1.02 1.409-.142 2.59.087 3.223.877a.5.5 0 0 0 .78 0c.633-.79 1.814-1.019 3.222-.877 1.378.139 2.8.62 3.681 1.02A.5.5 0 0 0 16 13.5v-11a.5.5 0 0 0-.293-.455c-.952-.433-2.48-.952-3.994-1.105C10.413.809 8.985.936 8 1.783z"/>
+            </svg>
+            <span class="sidebar-link-text">Documentation</span>
+          </a>
+        </div>
+        {% endif %}
+      </nav>
+    </div>
+    <div class="sidebar-footer">
+      <button class="theme-toggle-sidebar" type="button" id="themeToggle" aria-label="Toggle dark mode">
+        <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" class="theme-icon" id="themeToggleSun" viewBox="0 0 16 16">
+          <path d="M8 11.5a3.5 3.5 0 1 1 0-7 3.5 3.5 0 0 1 0 7zm0 1.5a5 5 0 1 0 0-10 5 5 0 0 0 0 10zM8 0a.5.5 0 0 1 .5.5v1.555a.5.5 0 0 1-1 0V.5A.5.5 0 0 1 8 0zm0 12.945a.5.5 0 0 1 .5.5v2.055a.5.5 0 0 1-1 0v-2.055a.5.5 0 0 1 .5-.5zM2.343 2.343a.5.5 0 0 1 .707 0l1.1 1.1a.5.5 0 1 1-.708.707l-1.1-1.1a.5.5 0 0 1 0-.707zm9.507 9.507a.5.5 0 0 1 .707 0l1.1 1.1a.5.5 0 1 1-.707.708l-1.1-1.1a.5.5 0 0 1 0-.708zM0 8a.5.5 0 0 1 .5-.5h1.555a.5.5 0 0 1 0 1H.5A.5.5 0 0 1 0 8zm12.945 0a.5.5 0 0 1 .5-.5H15.5a.5.5 0 0 1 0 1h-2.055a.5.5 0 0 1-.5-.5zM2.343 13.657a.5.5 0 0 1 0-.707l1.1-1.1a.5.5 0 1 1 .708.707l-1.1 1.1a.5.5 0 0 1-.708 0zm9.507-9.507a.5.5 0 0 1 0-.707l1.1-1.1a.5.5 0 0 1 .707.708l-1.1 1.1a.5.5 0 0 1-.707 0z"/>
+        </svg>
+        <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" class="theme-icon" id="themeToggleMoon" viewBox="0 0 16 16">
+          <path d="M6 .278a.768.768 0 0 1 .08.858 7.208 7.208 0 0 0-.878 3.46c0 4.021 3.278 7.277 7.318 7.277.527 0 1.04-.055 1.533-.16a.787.787 0 0 1 .81.316.733.733 0 0 1-.031.893A8.349 8.349 0 0 1 8.344 16C3.734 16 0 12.286 0 7.71 0 4.266 2.114 1.312 5.124.06A.752.752 0 0 1 6 .278z"/>
+          <path d="M10.794 3.148a.217.217 0 0 1 .412 0l.387 1.162c.173.518.579.924 1.097 1.097l1.162.387a.217.217 0 0 1 0 .412l-1.162.387a1.734 1.734 0 0 0-1.097 1.097l-.387 1.162a.217.217 0 0 1-.412 0l-.387-1.162A1.734 1.734 0 0 0 9.31 6.593l-1.162-.387a.217.217 0 0 1 0-.412l1.162-.387a1.734 1.734 0 0 0 1.097-1.097l.387-1.162zM13.863.099a.145.145 0 0 1 .274 0l.258.774c.115.346.386.617.732.732l.774.258a.145.145 0 0 1 0 .274l-.774.258a1.156 1.156 0 0 0-.732.732l-.258.774a.145.145 0 0 1-.274 0l-.258-.774a1.156 1.156 0 0 0-.732-.732l-.774-.258a.145.145 0 0 1 0-.274l.774-.258c.346-.115.617-.386.732-.732L13.863.1z"/>
+        </svg>
+        <span class="theme-toggle-text">Toggle theme</span>
+      </button>
+      {% if principal %}
+      <div class="sidebar-user" data-username="{{ principal.display_name }}">
+        <div class="user-avatar">
+          <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" viewBox="0 0 16 16">
+            <path d="M11 6a3 3 0 1 1-6 0 3 3 0 0 1 6 0z"/>
+            <path fill-rule="evenodd" d="M0 8a8 8 0 1 1 16 0A8 8 0 0 1 0 8zm8-7a7 7 0 0 0-5.468 11.37C3.242 11.226 4.805 10 8 10s4.757 1.225 5.468 2.37A7 7 0 0 0 8 1z"/>
+          </svg>
+        </div>
+        <div class="user-info">
+          <div class="user-name" title="{{ principal.display_name }}">{{ principal.display_name | truncate(16, true) }}</div>
+          <div class="user-key">{{ principal.access_key | truncate(12, true) }}</div>
+        </div>
+      </div>
+      <form method="post" action="{{ url_for('ui.logout') }}" class="w-100">
+        <input type="hidden" name="csrf_token" value="{{ csrf_token() }}" />
+        <button class="sidebar-logout-btn" type="submit">
+          <svg xmlns="http://www.w3.org/2000/svg" width="18" height="18" fill="currentColor" viewBox="0 0 16 16">
+            <path fill-rule="evenodd" d="M10 12.5a.5.5 0 0 1-.5.5h-8a.5.5 0 0 1-.5-.5v-9a.5.5 0 0 1 .5-.5h8a.5.5 0 0 1 .5.5v2a.5.5 0 0 0 1 0v-2A1.5 1.5 0 0 0 9.5 2h-8A1.5 1.5 0 0 0 0 3.5v9A1.5 1.5 0 0 0 1.5 14h8a1.5 1.5 0 0 0 1.5-1.5v-2a.5.5 0 0 0-1 0v2z"/>
+            <path fill-rule="evenodd" d="M15.854 8.354a.5.5 0 0 0 0-.708l-3-3a.5.5 0 0 0-.708.708L14.293 7.5H5.5a.5.5 0 0 0 0 1h8.793l-2.147 2.146a.5.5 0 0 0 .708.708l3-3z"/>
+          </svg>
+          <span class="logout-text">Sign out</span>
+        </button>
+      </form>
+      {% endif %}
+    </div>
+  </aside>
+
+  <div class="main-wrapper">
+    <main class="main-content">
+      {% block content %}{% endblock %}
+    </main>
+  </div>
   <div class="toast-container position-fixed bottom-0 end-0 p-3">
     <div id="liveToast" class="toast" role="alert" aria-live="assertive" aria-atomic="true">
       <div class="toast-header">
@@ -162,9 +275,11 @@
 (function () {
   const storageKey = 'myfsio-theme';
   const toggle = document.getElementById('themeToggle');
-  const label = document.getElementById('themeToggleLabel');
+  const toggleMobile = document.getElementById('themeToggleMobile');
   const sunIcon = document.getElementById('themeToggleSun');
   const moonIcon = document.getElementById('themeToggleMoon');
+  const sunIconMobile = document.getElementById('themeToggleSunMobile');
+  const moonIconMobile = document.getElementById('themeToggleMoonMobile');

   const applyTheme = (theme) => {
     document.documentElement.dataset.bsTheme = theme;
@@ -172,29 +287,74 @@
     try {
       localStorage.setItem(storageKey, theme);
     } catch (err) {
-      /* localStorage unavailable */
-    }
-    if (label) {
-      label.textContent = theme === 'dark' ? 'Switch to light mode' : 'Switch to dark mode';
-    }
-    if (toggle) {
-      toggle.setAttribute('aria-pressed', theme === 'dark' ? 'true' : 'false');
-      toggle.setAttribute('title', theme === 'dark' ? 'Switch to light mode' : 'Switch to dark mode');
-      toggle.setAttribute('aria-label', theme === 'dark' ? 'Switch to light mode' : 'Switch to dark mode');
+      console.log("Error: local storage not available, cannot save theme preference.");
     }
+    const isDark = theme === 'dark';
     if (sunIcon && moonIcon) {
-      const isDark = theme === 'dark';
       sunIcon.classList.toggle('d-none', !isDark);
       moonIcon.classList.toggle('d-none', isDark);
     }
+    if (sunIconMobile && moonIconMobile) {
+      sunIconMobile.classList.toggle('d-none', !isDark);
+      moonIconMobile.classList.toggle('d-none', isDark);
+    }
+    [toggle, toggleMobile].forEach(btn => {
+      if (btn) {
+        btn.setAttribute('aria-pressed', isDark ? 'true' : 'false');
+        btn.setAttribute('title', isDark ? 'Switch to light mode' : 'Switch to dark mode');
+        btn.setAttribute('aria-label', isDark ? 'Switch to light mode' : 'Switch to dark mode');
+      }
+    });
   };

   const current = document.documentElement.dataset.bsTheme || 'light';
   applyTheme(current);

-  toggle?.addEventListener('click', () => {
+  const handleToggle = () => {
     const next = document.documentElement.dataset.bsTheme === 'dark' ? 'light' : 'dark';
     applyTheme(next);
+  };
+
+  toggle?.addEventListener('click', handleToggle);
+  toggleMobile?.addEventListener('click', handleToggle);
+})();
+</script>
+<script>
+(function () {
+  const sidebar = document.getElementById('desktopSidebar');
+  const collapseBtn = document.getElementById('sidebarCollapseBtn');
+  const sidebarBrand = document.getElementById('sidebarBrand');
+  const storageKey = 'myfsio-sidebar-collapsed';
+
+  if (!sidebar || !collapseBtn) return;
+
+  const applyCollapsed = (collapsed) => {
+    sidebar.classList.toggle('sidebar-collapsed', collapsed);
+    document.body.classList.toggle('sidebar-is-collapsed', collapsed);
+    document.documentElement.classList.remove('sidebar-will-collapse');
+    try {
+      localStorage.setItem(storageKey, collapsed ? 'true' : 'false');
+    } catch (err) {}
+  };
+
+  try {
+    const stored = localStorage.getItem(storageKey);
+    applyCollapsed(stored === 'true');
+  } catch (err) {
+    document.documentElement.classList.remove('sidebar-will-collapse');
+  }
+
+  collapseBtn.addEventListener('click', () => {
+    const isCollapsed = sidebar.classList.contains('sidebar-collapsed');
+    applyCollapsed(!isCollapsed);
+  });
+
+  sidebarBrand?.addEventListener('click', (e) => {
+    const isCollapsed = sidebar.classList.contains('sidebar-collapsed');
+    if (isCollapsed) {
+      e.preventDefault();
+      applyCollapsed(false);
+    }
   });
 })();
 </script>
@@ -233,6 +393,8 @@
   {% endwith %}
 })();
 </script>
+<script src="{{ url_for('static', filename='js/ui-core.js') }}"></script>
 {% block extra_scripts %}{% endblock %}
+
 </body>
 </html>
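Two of the base-template hunks above cooperate to avoid a flash of the wrong sidebar state on page load: the inline script in the head tags the root element with sidebar-will-collapse before the first paint, and applyCollapsed later swaps that temporary marker for the persistent sidebar-collapsed and sidebar-is-collapsed classes. A condensed restatement of that ordering follows; the CSS that reacts to sidebar-will-collapse presumably lives in main.css and is not shown in this diff, so its existence is an assumption, while the class names themselves come from the hunks above.

// Phase 1: inline in <head>, runs before first paint.
try {
  if (localStorage.getItem('myfsio-sidebar-collapsed') === 'true') {
    document.documentElement.classList.add('sidebar-will-collapse');
  }
} catch (err) { /* localStorage may be unavailable */ }

// Phase 2: after the DOM exists, swap the pre-paint marker for the real state.
function applyCollapsedSketch(collapsed) {
  var sidebar = document.getElementById('desktopSidebar');
  if (!sidebar) return;
  sidebar.classList.toggle('sidebar-collapsed', collapsed);
  document.body.classList.toggle('sidebar-is-collapsed', collapsed);
  document.documentElement.classList.remove('sidebar-will-collapse');
}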
File diff suppressed because it is too large
@@ -104,7 +104,7 @@
 </h1>
 <button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
 </div>
-<form method="post" action="{{ url_for('ui.create_bucket') }}">
+<form method="post" action="{{ url_for('ui.create_bucket') }}" id="createBucketForm">
 <input type="hidden" name="csrf_token" value="{{ csrf_token() }}" />
 <div class="modal-body pt-0">
 <label class="form-label fw-medium">Bucket name</label>
@@ -205,6 +205,25 @@
   });
   row.style.cursor = 'pointer';
 });
+
+var createForm = document.getElementById('createBucketForm');
+if (createForm) {
+  createForm.addEventListener('submit', function(e) {
+    e.preventDefault();
+    window.UICore.submitFormAjax(createForm, {
+      successMessage: 'Bucket created',
+      onSuccess: function(data) {
+        var modal = bootstrap.Modal.getInstance(document.getElementById('createBucketModal'));
+        if (modal) modal.hide();
+        if (data.bucket_name) {
+          window.location.href = '{{ url_for("ui.bucket_detail", bucket_name="__BUCKET__") }}'.replace('__BUCKET__', data.bucket_name);
+        } else {
+          location.reload();
+        }
+      }
+    });
+  });
+}
 })();
 </script>
 {% endblock %}
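The redirect above uses a sentinel-URL idiom that recurs throughout these changes (__BUCKET__, ACCESS_KEY_PLACEHOLDER, CONNECTION_ID, CONN_ID): url_for is rendered once at template time with a placeholder value, and JavaScript substitutes the real identifier at runtime, which keeps all route construction on the Flask side. A small self-contained sketch of the idiom; the route string here is invented for illustration:

// Hypothetical illustration of the sentinel-URL pattern; '/ui/buckets/__BUCKET__'
// stands in for whatever url_for("ui.bucket_detail", bucket_name="__BUCKET__") renders.
var urlTemplate = '/ui/buckets/__BUCKET__';
function bucketUrl(name) {
  return urlTemplate.replace('__BUCKET__', encodeURIComponent(name));
}
console.log(bucketUrl('photos')); // "/ui/buckets/photos"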
@@ -57,7 +57,7 @@
|
|||||||
<label for="secret_key" class="form-label fw-medium">Secret Key</label>
|
<label for="secret_key" class="form-label fw-medium">Secret Key</label>
|
||||||
<div class="input-group">
|
<div class="input-group">
|
||||||
<input type="password" class="form-control font-monospace" id="secret_key" name="secret_key" required>
|
<input type="password" class="form-control font-monospace" id="secret_key" name="secret_key" required>
|
||||||
<button class="btn btn-outline-secondary" type="button" onclick="togglePassword('secret_key')" title="Toggle visibility">
|
<button class="btn btn-outline-secondary" type="button" onclick="ConnectionsManagement.togglePassword('secret_key')" title="Toggle visibility">
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" viewBox="0 0 16 16">
|
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" viewBox="0 0 16 16">
|
||||||
<path d="M16 8s-3-5.5-8-5.5S0 8 0 8s3 5.5 8 5.5S16 8 16 8zM1.173 8a13.133 13.133 0 0 1 1.66-2.043C4.12 4.668 5.88 3.5 8 3.5c2.12 0 3.879 1.168 5.168 2.457A13.133 13.133 0 0 1 14.828 8c-.058.087-.122.183-.195.288-.335.48-.83 1.12-1.465 1.755C11.879 11.332 10.119 12.5 8 12.5c-2.12 0-3.879-1.168-5.168-2.457A13.134 13.134 0 0 1 1.172 8z"/>
|
<path d="M16 8s-3-5.5-8-5.5S0 8 0 8s3 5.5 8 5.5S16 8 16 8zM1.173 8a13.133 13.133 0 0 1 1.66-2.043C4.12 4.668 5.88 3.5 8 3.5c2.12 0 3.879 1.168 5.168 2.457A13.133 13.133 0 0 1 14.828 8c-.058.087-.122.183-.195.288-.335.48-.83 1.12-1.465 1.755C11.879 11.332 10.119 12.5 8 12.5c-2.12 0-3.879-1.168-5.168-2.457A13.134 13.134 0 0 1 1.172 8z"/>
|
||||||
<path d="M8 5.5a2.5 2.5 0 1 0 0 5 2.5 2.5 0 0 0 0-5zM4.5 8a3.5 3.5 0 1 1 7 0 3.5 3.5 0 0 1-7 0z"/>
|
<path d="M8 5.5a2.5 2.5 0 1 0 0 5 2.5 2.5 0 0 0 0-5zM4.5 8a3.5 3.5 0 1 1 7 0 3.5 3.5 0 0 1-7 0z"/>
|
||||||
@@ -220,7 +220,7 @@
|
|||||||
<label for="edit_secret_key" class="form-label fw-medium">Secret Key</label>
|
<label for="edit_secret_key" class="form-label fw-medium">Secret Key</label>
|
||||||
<div class="input-group">
|
<div class="input-group">
|
||||||
<input type="password" class="form-control font-monospace" id="edit_secret_key" name="secret_key" required>
|
<input type="password" class="form-control font-monospace" id="edit_secret_key" name="secret_key" required>
|
||||||
<button class="btn btn-outline-secondary" type="button" onclick="togglePassword('edit_secret_key')">
|
<button class="btn btn-outline-secondary" type="button" onclick="ConnectionsManagement.togglePassword('edit_secret_key')">
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" viewBox="0 0 16 16">
|
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" viewBox="0 0 16 16">
|
||||||
<path d="M16 8s-3-5.5-8-5.5S0 8 0 8s3 5.5 8 5.5S16 8 16 8zM1.173 8a13.133 13.133 0 0 1 1.66-2.043C4.12 4.668 5.88 3.5 8 3.5c2.12 0 3.879 1.168 5.168 2.457A13.133 13.133 0 0 1 14.828 8c-.058.087-.122.183-.195.288-.335.48-.83 1.12-1.465 1.755C11.879 11.332 10.119 12.5 8 12.5c-2.12 0-3.879-1.168-5.168-2.457A13.134 13.134 0 0 1 1.172 8z"/>
|
<path d="M16 8s-3-5.5-8-5.5S0 8 0 8s3 5.5 8 5.5S16 8 16 8zM1.173 8a13.133 13.133 0 0 1 1.66-2.043C4.12 4.668 5.88 3.5 8 3.5c2.12 0 3.879 1.168 5.168 2.457A13.133 13.133 0 0 1 14.828 8c-.058.087-.122.183-.195.288-.335.48-.83 1.12-1.465 1.755C11.879 11.332 10.119 12.5 8 12.5c-2.12 0-3.879-1.168-5.168-2.457A13.134 13.134 0 0 1 1.172 8z"/>
|
||||||
<path d="M8 5.5a2.5 2.5 0 1 0 0 5 2.5 2.5 0 0 0 0-5zM4.5 8a3.5 3.5 0 1 1 7 0 3.5 3.5 0 0 1-7 0z"/>
|
<path d="M8 5.5a2.5 2.5 0 1 0 0 5 2.5 2.5 0 0 0 0-5zM4.5 8a3.5 3.5 0 1 1 7 0 3.5 3.5 0 0 1-7 0z"/>
|
||||||
@@ -289,153 +289,16 @@
</div>
</div>

+<script src="{{ url_for('static', filename='js/connections-management.js') }}"></script>
<script>
-function togglePassword(id) {
-  const input = document.getElementById(id);
-  if (input.type === "password") {
-    input.type = "text";
-  } else {
-    input.type = "password";
-  }
-}
-
-async function testConnection(formId, resultId) {
-  const form = document.getElementById(formId);
-  const resultDiv = document.getElementById(resultId);
-  const formData = new FormData(form);
-  const data = Object.fromEntries(formData.entries());
-
-  resultDiv.innerHTML = '<div class="text-info"><span class="spinner-border spinner-border-sm" role="status" aria-hidden="true"></span> Testing connection...</div>';
-
-  const controller = new AbortController();
-  const timeoutId = setTimeout(() => controller.abort(), 20000);
-
-  try {
-    const response = await fetch("{{ url_for('ui.test_connection') }}", {
-      method: "POST",
-      headers: {
-        "Content-Type": "application/json",
-        "X-CSRFToken": "{{ csrf_token() }}"
-      },
-      body: JSON.stringify(data),
-      signal: controller.signal
-    });
-    clearTimeout(timeoutId);
-
-    const result = await response.json();
-    if (response.ok) {
-      resultDiv.innerHTML = `<div class="text-success">
-        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="me-1" viewBox="0 0 16 16">
-          <path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zm-3.97-3.03a.75.75 0 0 0-1.08.022L7.477 9.417 5.384 7.323a.75.75 0 0 0-1.06 1.06L6.97 11.03a.75.75 0 0 0 1.079-.02l3.992-4.99a.75.75 0 0 0-.01-1.05z"/>
-        </svg>
-        ${result.message}
-      </div>`;
-    } else {
-      resultDiv.innerHTML = `<div class="text-danger">
-        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="me-1" viewBox="0 0 16 16">
-          <path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zM5.354 4.646a.5.5 0 1 0-.708.708L7.293 8l-2.647 2.646a.5.5 0 0 0 .708.708L8 8.707l2.646 2.647a.5.5 0 0 0 .708-.708L8.707 8l2.647-2.646a.5.5 0 0 0-.708-.708L8 7.293 5.354 4.646z"/>
-        </svg>
-        ${result.message}
-      </div>`;
-    }
-  } catch (error) {
-    clearTimeout(timeoutId);
-    if (error.name === 'AbortError') {
-      resultDiv.innerHTML = `<div class="text-danger">
-        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="me-1" viewBox="0 0 16 16">
-          <path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zM5.354 4.646a.5.5 0 1 0-.708.708L7.293 8l-2.647 2.646a.5.5 0 0 0 .708.708L8 8.707l2.646 2.647a.5.5 0 0 0 .708-.708L8.707 8l2.647-2.646a.5.5 0 0 0-.708-.708L8 7.293 5.354 4.646z"/>
-        </svg>
-        Connection test timed out - endpoint may be unreachable
-      </div>`;
-    } else {
-      resultDiv.innerHTML = `<div class="text-danger">
-        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="me-1" viewBox="0 0 16 16">
-          <path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zM5.354 4.646a.5.5 0 1 0-.708.708L7.293 8l-2.647 2.646a.5.5 0 0 0 .708.708L8 8.707l2.646 2.647a.5.5 0 0 0 .708-.708L8.707 8l2.647-2.646a.5.5 0 0 0-.708-.708L8 7.293 5.354 4.646z"/>
-        </svg>
-        Connection failed: Network error
-      </div>`;
-    }
-  }
-}
-
-document.getElementById('testConnectionBtn').addEventListener('click', () => {
-  testConnection('createConnectionForm', 'testResult');
-});
-
-document.getElementById('editTestConnectionBtn').addEventListener('click', () => {
-  testConnection('editConnectionForm', 'editTestResult');
-});
-
-const editModal = document.getElementById('editConnectionModal');
-editModal.addEventListener('show.bs.modal', event => {
-  const button = event.relatedTarget;
-  const id = button.getAttribute('data-id');
-
-  document.getElementById('edit_name').value = button.getAttribute('data-name');
-  document.getElementById('edit_endpoint_url').value = button.getAttribute('data-endpoint');
-  document.getElementById('edit_region').value = button.getAttribute('data-region');
-  document.getElementById('edit_access_key').value = button.getAttribute('data-access');
-  document.getElementById('edit_secret_key').value = button.getAttribute('data-secret');
-  document.getElementById('editTestResult').innerHTML = '';
-
-  const form = document.getElementById('editConnectionForm');
-  form.action = "{{ url_for('ui.update_connection', connection_id='CONN_ID') }}".replace('CONN_ID', id);
-});
-
-const deleteModal = document.getElementById('deleteConnectionModal');
-deleteModal.addEventListener('show.bs.modal', event => {
-  const button = event.relatedTarget;
-  const id = button.getAttribute('data-id');
-  const name = button.getAttribute('data-name');
-
-  document.getElementById('deleteConnectionName').textContent = name;
-  const form = document.getElementById('deleteConnectionForm');
-  form.action = "{{ url_for('ui.delete_connection', connection_id='CONN_ID') }}".replace('CONN_ID', id);
-});
-
-async function checkConnectionHealth(connectionId, statusEl) {
-  try {
-    const controller = new AbortController();
-    const timeoutId = setTimeout(() => controller.abort(), 15000);
-
-    const response = await fetch(`/ui/connections/${connectionId}/health`, {
-      signal: controller.signal
-    });
-    clearTimeout(timeoutId);
-
-    const data = await response.json();
-    if (data.healthy) {
-      statusEl.innerHTML = `
-        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="text-success" viewBox="0 0 16 16">
-          <path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zm-3.97-3.03a.75.75 0 0 0-1.08.022L7.477 9.417 5.384 7.323a.75.75 0 0 0-1.06 1.06L6.97 11.03a.75.75 0 0 0 1.079-.02l3.992-4.99a.75.75 0 0 0-.01-1.05z"/>
-        </svg>`;
-      statusEl.setAttribute('data-status', 'healthy');
-      statusEl.setAttribute('title', 'Connected');
-    } else {
-      statusEl.innerHTML = `
-        <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="text-danger" viewBox="0 0 16 16">
-          <path d="M16 8A8 8 0 1 1 0 8a8 8 0 0 1 16 0zM5.354 4.646a.5.5 0 1 0-.708.708L7.293 8l-2.647 2.646a.5.5 0 0 0 .708.708L8 8.707l2.646 2.647a.5.5 0 0 0 .708-.708L8.707 8l2.647-2.646a.5.5 0 0 0-.708-.708L8 7.293 5.354 4.646z"/>
-        </svg>`;
-      statusEl.setAttribute('data-status', 'unhealthy');
-      statusEl.setAttribute('title', data.error || 'Unreachable');
-    }
-  } catch (error) {
-    statusEl.innerHTML = `
-      <svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="text-warning" viewBox="0 0 16 16">
-        <path d="M8.982 1.566a1.13 1.13 0 0 0-1.96 0L.165 13.233c-.457.778.091 1.767.98 1.767h13.713c.889 0 1.438-.99.98-1.767L8.982 1.566zM8 5c.535 0 .954.462.9.995l-.35 3.507a.552.552 0 0 1-1.1 0L7.1 5.995A.905.905 0 0 1 8 5zm.002 6a1 1 0 1 1 0 2 1 1 0 0 1 0-2z"/>
-      </svg>`;
-    statusEl.setAttribute('data-status', 'unknown');
-    statusEl.setAttribute('title', 'Could not check status');
-  }
-}
-
-const connectionRows = document.querySelectorAll('tr[data-connection-id]');
-connectionRows.forEach((row, index) => {
-  const connectionId = row.getAttribute('data-connection-id');
-  const statusEl = row.querySelector('.connection-status');
-  if (statusEl) {
-    setTimeout(() => checkConnectionHealth(connectionId, statusEl), index * 200);
-  }
-});
+ConnectionsManagement.init({
+  csrfToken: "{{ csrf_token() }}",
+  endpoints: {
+    test: "{{ url_for('ui.test_connection') }}",
+    updateTemplate: "{{ url_for('ui.update_connection', connection_id='CONNECTION_ID') }}",
+    deleteTemplate: "{{ url_for('ui.delete_connection', connection_id='CONNECTION_ID') }}",
+    healthTemplate: "/ui/connections/CONNECTION_ID/health"
+  }
+});
</script>
{% endblock %}
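The `healthTemplate` endpoint polled by the removed script (and presumably by the new module) returns JSON of the form `{"healthy": ..., "error": ...}`. A minimal Flask sketch of a route producing that shape; the blueprint wiring and the `probe_connection` helper are assumptions, not the project's actual implementation:

import json  # not required by Flask itself; shown for clarity only
from flask import Blueprint, jsonify

ui = Blueprint("ui", __name__)

def probe_connection(connection_id):
    # Hypothetical helper: look up the stored connection and issue a cheap
    # request (e.g. list buckets) against its endpoint, raising on failure.
    raise NotImplementedError

@ui.get("/ui/connections/<connection_id>/health")
def connection_health(connection_id):
    try:
        probe_connection(connection_id)
        return jsonify({"healthy": True})
    except Exception as exc:
        # The client displays this string in the status icon's title attribute.
        return jsonify({"healthy": False, "error": str(exc)})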
@@ -38,6 +38,7 @@
<li><a href="#versioning">Object Versioning</a></li>
<li><a href="#quotas">Bucket Quotas</a></li>
<li><a href="#encryption">Encryption</a></li>
+<li><a href="#lifecycle">Lifecycle Rules</a></li>
<li><a href="#troubleshooting">Troubleshooting</a></li>
</ul>
</div>
@@ -606,11 +607,49 @@ except Exception as e:
<li>Follow the steps above to replicate <strong>A → B</strong>.</li>
<li>Repeat the process on Server B to replicate <strong>B → A</strong> (create a connection to A, enable rule).</li>
</ol>
-<p class="small text-muted mb-0">
+<p class="small text-muted mb-3">
<strong>Loop Prevention:</strong> The system automatically detects replication traffic using a custom User-Agent (<code>S3ReplicationAgent</code>). This prevents infinite loops where an object replicated from A to B is immediately replicated back to A.
<br>
<strong>Deletes:</strong> Deleting an object on one server will propagate the deletion to the other server.
</p>

+<h3 class="h6 text-uppercase text-muted mt-4">Error Handling & Rate Limits</h3>
+<p class="small text-muted mb-3">The replication system handles transient failures automatically:</p>
+<div class="table-responsive mb-3">
+<table class="table table-sm table-bordered small">
+<thead class="table-light">
+<tr>
+<th>Behavior</th>
+<th>Details</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td><strong>Retry Logic</strong></td>
+<td>boto3 automatically handles 429 (rate limit) errors using exponential backoff with <code>max_attempts=2</code></td>
+</tr>
+<tr>
+<td><strong>Concurrency</strong></td>
+<td>Uses a ThreadPoolExecutor with 4 parallel workers for replication tasks</td>
+</tr>
+<tr>
+<td><strong>Timeouts</strong></td>
+<td>Connect: 5s, Read: 30s. Large files use streaming transfers</td>
+</tr>
+</tbody>
+</table>
+</div>
+<div class="alert alert-warning border mb-0">
+<div class="d-flex gap-2">
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="bi bi-exclamation-triangle text-warning mt-1 flex-shrink-0" viewBox="0 0 16 16">
+<path d="M7.938 2.016A.13.13 0 0 1 8.002 2a.13.13 0 0 1 .063.016.146.146 0 0 1 .054.057l6.857 11.667c.036.06.035.124.002.183a.163.163 0 0 1-.054.06.116.116 0 0 1-.066.017H1.146a.115.115 0 0 1-.066-.017.163.163 0 0 1-.054-.06.176.176 0 0 1 .002-.183L7.884 2.073a.147.147 0 0 1 .054-.057zm1.044-.45a1.13 1.13 0 0 0-1.96 0L.165 13.233c-.457.778.091 1.767.98 1.767h13.713c.889 0 1.438-.99.98-1.767L8.982 1.566z"/>
+<path d="M7.002 12a1 1 0 1 1 2 0 1 1 0 0 1-2 0zM7.1 5.995a.905.905 0 1 1 1.8 0l-.35 3.507a.552.552 0 0 1-1.1 0L7.1 5.995z"/>
+</svg>
+<div>
+<strong>Large File Counts:</strong> When replicating buckets with many objects, the target server's rate limits may cause delays. There is no built-in pause mechanism. Consider increasing <code>RATE_LIMIT_DEFAULT</code> on the target server during bulk replication operations.
+</div>
+</div>
+</div>
</div>
</article>
<article id="versioning" class="card shadow-sm docs-section">
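The retry and timeout behavior documented in the table above maps directly onto botocore's client configuration. A sketch of how such a replication client could be assembled with boto3; the endpoint URL and credentials are placeholders, and whether the project builds its client exactly this way is not shown in this hunk:

import boto3
from botocore.config import Config

replication_config = Config(
    retries={"max_attempts": 2, "mode": "standard"},  # backoff on 429s, max_attempts=2
    connect_timeout=5,   # "Connect: 5s" from the table above
    read_timeout=30,     # "Read: 30s"
    user_agent_extra="S3ReplicationAgent",  # the loop-prevention marker described above
)

client = boto3.client(
    "s3",
    endpoint_url="https://server-b.example.com",  # placeholder target endpoint
    aws_access_key_id="TARGET_ACCESS_KEY",        # placeholder credentials
    aws_secret_access_key="TARGET_SECRET_KEY",
    config=replication_config,
)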
@@ -855,10 +894,92 @@ curl -X DELETE "{{ api_base }}/kms/keys/{key-id}?waiting_period_days=30" \
</p>
</div>
</article>
-<article id="troubleshooting" class="card shadow-sm docs-section">
+<article id="lifecycle" class="card shadow-sm docs-section">
<div class="card-body">
<div class="d-flex align-items-center gap-2 mb-3">
<span class="docs-section-kicker">12</span>
+<h2 class="h4 mb-0">Lifecycle Rules</h2>
+</div>
+<p class="text-muted">Automatically delete expired objects, clean up old versions, and abort incomplete multipart uploads using time-based lifecycle rules.</p>
+
+<h3 class="h6 text-uppercase text-muted mt-4">How It Works</h3>
+<p class="small text-muted mb-3">
+Lifecycle rules run on a background timer (Python <code>threading.Timer</code>), not a system cronjob. The enforcement cycle triggers every <strong>3600 seconds (1 hour)</strong> by default. Each cycle scans all buckets with lifecycle configurations and applies matching rules.
+</p>
+
+<h3 class="h6 text-uppercase text-muted mt-4">Expiration Types</h3>
+<div class="table-responsive mb-3">
+<table class="table table-sm table-bordered small">
+<thead class="table-light">
+<tr>
+<th>Type</th>
+<th>Description</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td><strong>Expiration (Days)</strong></td>
+<td>Delete current objects older than N days from their last modification</td>
+</tr>
+<tr>
+<td><strong>Expiration (Date)</strong></td>
+<td>Delete current objects after a specific date (ISO 8601 format)</td>
+</tr>
+<tr>
+<td><strong>NoncurrentVersionExpiration</strong></td>
+<td>Delete non-current (archived) versions older than N days from when they became non-current</td>
+</tr>
+<tr>
+<td><strong>AbortIncompleteMultipartUpload</strong></td>
+<td>Abort multipart uploads that have been in progress longer than N days</td>
+</tr>
+</tbody>
+</table>
+</div>
+
+<h3 class="h6 text-uppercase text-muted mt-4">API Usage</h3>
+<pre class="mb-3"><code class="language-bash"># Set lifecycle rule (delete objects older than 30 days)
+curl -X PUT "{{ api_base }}/<bucket>?lifecycle" \
+  -H "Content-Type: application/json" \
+  -H "X-Access-Key: <key>" -H "X-Secret-Key: <secret>" \
+  -d '[{
+    "ID": "expire-old-objects",
+    "Status": "Enabled",
+    "Prefix": "",
+    "Expiration": {"Days": 30}
+  }]'
+
+# Abort incomplete multipart uploads after 7 days
+curl -X PUT "{{ api_base }}/<bucket>?lifecycle" \
+  -H "Content-Type: application/json" \
+  -H "X-Access-Key: <key>" -H "X-Secret-Key: <secret>" \
+  -d '[{
+    "ID": "cleanup-multipart",
+    "Status": "Enabled",
+    "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7}
+  }]'
+
+# Get current lifecycle configuration
+curl "{{ api_base }}/<bucket>?lifecycle" \
+  -H "X-Access-Key: <key>" -H "X-Secret-Key: <secret>"</code></pre>
+
+<div class="alert alert-light border mb-0">
+<div class="d-flex gap-2">
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="bi bi-info-circle text-muted mt-1 flex-shrink-0" viewBox="0 0 16 16">
+<path d="M8 15A7 7 0 1 1 8 1a7 7 0 0 1 0 14zm0 1A8 8 0 1 0 8 0a8 8 0 0 0 0 16z"/>
+<path d="m8.93 6.588-2.29.287-.082.38.45.083c.294.07.352.176.288.469l-.738 3.468c-.194.897.105 1.319.808 1.319.545 0 1.178-.252 1.465-.598l.088-.416c-.2.176-.492.246-.686.246-.275 0-.375-.193-.304-.533L8.93 6.588zM9 4.5a1 1 0 1 1-2 0 1 1 0 0 1 2 0z"/>
+</svg>
+<div>
+<strong>Prefix Filtering:</strong> Use the <code>Prefix</code> field to scope rules to specific paths (e.g., <code>"logs/"</code>). Leave empty to apply to all objects in the bucket.
+</div>
+</div>
+</div>
+</div>
+</article>
+<article id="troubleshooting" class="card shadow-sm docs-section">
+<div class="card-body">
+<div class="d-flex align-items-center gap-2 mb-3">
+<span class="docs-section-kicker">13</span>
<h2 class="h4 mb-0">Troubleshooting & tips</h2>
</div>
<div class="table-responsive">
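The hourly enforcement cycle described in the "How It Works" paragraph above can be pictured as a self-rescheduling `threading.Timer`. This is a sketch under stated assumptions: the `buckets_with_lifecycle` and `apply_lifecycle_rules` helpers are hypothetical names, not the project's actual API.

import threading

ENFORCEMENT_INTERVAL = 3600  # seconds; the documented 1-hour default

def run_lifecycle_cycle(storage):
    try:
        # Scan every bucket with a lifecycle configuration and apply any
        # matching rules (both helper names here are hypothetical).
        for bucket in storage.buckets_with_lifecycle():
            storage.apply_lifecycle_rules(bucket)
    finally:
        # Re-arm the timer so the cycle keeps running even if a pass fails.
        timer = threading.Timer(ENFORCEMENT_INTERVAL, run_lifecycle_cycle, args=(storage,))
        timer.daemon = True
        timer.start()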
@@ -896,6 +1017,11 @@ curl -X DELETE "{{ api_base }}/kms/keys/{key-id}?waiting_period_days=30" \
<td>Proxy headers missing or <code>API_BASE_URL</code> incorrect</td>
<td>Ensure your proxy sends <code>X-Forwarded-Host</code>/<code>Proto</code> headers, or explicitly set <code>API_BASE_URL</code> to your public domain.</td>
</tr>
+<tr>
+<td>Large folder uploads hitting rate limits (429)</td>
+<td><code>RATE_LIMIT_DEFAULT</code> exceeded (200/min)</td>
+<td>Increase rate limit in env config, use Redis backend (<code>RATE_LIMIT_STORAGE_URI=redis://host:port</code>) for distributed setups, or upload in smaller batches.</td>
+</tr>
</tbody>
</table>
</div>
@@ -918,6 +1044,7 @@ curl -X DELETE "{{ api_base }}/kms/keys/{key-id}?waiting_period_days=30" \
<li><a href="#versioning">Object Versioning</a></li>
<li><a href="#quotas">Bucket Quotas</a></li>
<li><a href="#encryption">Encryption</a></li>
+<li><a href="#lifecycle">Lifecycle Rules</a></li>
<li><a href="#troubleshooting">Troubleshooting</a></li>
</ul>
<div class="docs-sidebar-callouts">
@@ -116,8 +116,8 @@
<div class="card h-100 iam-user-card">
<div class="card-body">
<div class="d-flex align-items-start justify-content-between mb-3">
-<div class="d-flex align-items-center gap-3">
+<div class="d-flex align-items-center gap-3 min-width-0 overflow-hidden">
-<div class="user-avatar user-avatar-lg">
+<div class="user-avatar user-avatar-lg flex-shrink-0">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" fill="currentColor" viewBox="0 0 16 16">
<path d="M8 8a3 3 0 1 0 0-6 3 3 0 0 0 0 6zm2-3a2 2 0 1 1-4 0 2 2 0 0 1 4 0zm4 8c0 1-1 1-1 1H3s-1 0-1-1 1-4 6-4 6 3 6 4zm-1-.004c-.001-.246-.154-.986-.832-1.664C11.516 10.68 10.289 10 8 10c-2.29 0-3.516.68-4.168 1.332-.678.678-.83 1.418-.832 1.664h10z"/>
</svg>
@@ -127,7 +127,7 @@
<code class="small text-muted d-block text-truncate" title="{{ user.access_key }}">{{ user.access_key }}</code>
</div>
</div>
-<div class="dropdown">
+<div class="dropdown flex-shrink-0">
<button class="btn btn-sm btn-icon" type="button" data-bs-toggle="dropdown" aria-expanded="false">
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" viewBox="0 0 16 16">
<path d="M9.5 13a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0zm0-5a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0zm0-5a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0z"/>
@@ -355,8 +355,8 @@
<div class="modal-header border-0 pb-0">
<h1 class="modal-title fs-5 fw-semibold">
<svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" fill="currentColor" class="text-danger" viewBox="0 0 16 16">
-<path d="M1 14s-1 0-1-1 1-4 6-4 6 3 6 4-1 1-1 1H1zm5-6a3 3 0 1 0 0-6 3 3 0 0 0 0 6z"/>
-<path fill-rule="evenodd" d="M11 1.5v1h5v1h-1v9a2 2 0 0 1-2 2H3a2 2 0 0 1-2-2v-9H0v-1h5v-1a1 1 0 0 1 1-1h4a1 1 0 0 1 1 1zM4.118 4 4 4.059V13a1 1 0 0 0 1 1h6a1 1 0 0 0 1-1V4.059L11.882 4H4.118z"/>
+<path d="M11 5a3 3 0 1 1-6 0 3 3 0 0 1 6 0M8 7a2 2 0 1 0 0-4 2 2 0 0 0 0 4m.256 7a4.5 4.5 0 0 1-.229-1.004H3c.001-.246.154-.986.832-1.664C4.484 10.68 5.711 10 8 10q.39 0 .74.025c.226-.341.496-.65.804-.918Q9.077 9.014 8 9c-5 0-6 3-6 4s1 1 1 1h5.256Z"/>
+<path d="M12.5 16a3.5 3.5 0 1 0 0-7 3.5 3.5 0 0 0 0 7m-.646-4.854.646.647.646-.647a.5.5 0 0 1 .708.708l-.647.646.647.646a.5.5 0 0 1-.708.708l-.646-.647-.646.647a.5.5 0 0 1-.708-.708l.647-.646-.647-.646a.5.5 0 0 1 .708-.708"/>
</svg>
Delete User
</h1>
@@ -454,339 +454,20 @@

{% block extra_scripts %}
{{ super() }}
+<script src="{{ url_for('static', filename='js/iam-management.js') }}"></script>
<script>
-(function () {
-  function setupJsonAutoIndent(textarea) {
-    if (!textarea) return;
-
-    textarea.addEventListener('keydown', function(e) {
-      if (e.key === 'Enter') {
-        e.preventDefault();
-
-        const start = this.selectionStart;
-        const end = this.selectionEnd;
-        const value = this.value;
-
-        const lineStart = value.lastIndexOf('\n', start - 1) + 1;
-        const currentLine = value.substring(lineStart, start);
-
-        const indentMatch = currentLine.match(/^(\s*)/);
-        let indent = indentMatch ? indentMatch[1] : '';
-
-        const trimmedLine = currentLine.trim();
-        const lastChar = trimmedLine.slice(-1);
-
-        const charBeforeCursor = value.substring(start - 1, start).trim();
-
-        let newIndent = indent;
-        let insertAfter = '';
-
-        if (lastChar === '{' || lastChar === '[') {
-          newIndent = indent + '  ';
-
-          const charAfterCursor = value.substring(start, start + 1).trim();
-          if ((lastChar === '{' && charAfterCursor === '}') ||
-              (lastChar === '[' && charAfterCursor === ']')) {
-            insertAfter = '\n' + indent;
-          }
-        } else if (lastChar === ',' || lastChar === ':') {
-          newIndent = indent;
-        }
-
-        const insertion = '\n' + newIndent + insertAfter;
-        const newValue = value.substring(0, start) + insertion + value.substring(end);
-
-        this.value = newValue;
-
-        const newCursorPos = start + 1 + newIndent.length;
-        this.selectionStart = this.selectionEnd = newCursorPos;
-
-        this.dispatchEvent(new Event('input', { bubbles: true }));
-      }
-
-      if (e.key === 'Tab') {
-        e.preventDefault();
-        const start = this.selectionStart;
-        const end = this.selectionEnd;
-
-        if (e.shiftKey) {
-          const lineStart = this.value.lastIndexOf('\n', start - 1) + 1;
-          const lineContent = this.value.substring(lineStart, start);
-          if (lineContent.startsWith('  ')) {
-            this.value = this.value.substring(0, lineStart) +
-                         this.value.substring(lineStart + 2);
-            this.selectionStart = this.selectionEnd = Math.max(lineStart, start - 2);
-          }
-        } else {
-          this.value = this.value.substring(0, start) + '  ' + this.value.substring(end);
-          this.selectionStart = this.selectionEnd = start + 2;
-        }
-
-        this.dispatchEvent(new Event('input', { bubbles: true }));
-      }
-    });
-  }
-
-  setupJsonAutoIndent(document.getElementById('policyEditorDocument'));
-  setupJsonAutoIndent(document.getElementById('createUserPolicies'));
-
-  const currentUserKey = {{ principal.access_key | tojson }};
-  const configCopyButtons = document.querySelectorAll('.config-copy');
-  configCopyButtons.forEach((button) => {
-    button.addEventListener('click', async () => {
-      const targetId = button.dataset.copyTarget;
-      const target = document.getElementById(targetId);
-      if (!target) return;
-      const text = target.innerText;
-      try {
-        await navigator.clipboard.writeText(text);
-        button.textContent = 'Copied!';
-        setTimeout(() => {
-          button.textContent = 'Copy JSON';
-        }, 1500);
-      } catch (err) {
-        console.error('Unable to copy IAM config', err);
-      }
-    });
-  });
-
-  const secretCopyButton = document.querySelector('[data-secret-copy]');
-  if (secretCopyButton) {
-    secretCopyButton.addEventListener('click', async () => {
-      const secretInput = document.getElementById('disclosedSecretValue');
-      if (!secretInput) return;
-      try {
-        await navigator.clipboard.writeText(secretInput.value);
-        secretCopyButton.textContent = 'Copied!';
-        setTimeout(() => {
-          secretCopyButton.textContent = 'Copy';
-        }, 1500);
-      } catch (err) {
-        console.error('Unable to copy IAM secret', err);
-      }
-    });
-  }
-
-  const iamUsersData = document.getElementById('iamUsersJson');
-  const users = iamUsersData ? JSON.parse(iamUsersData.textContent || '[]') : [];
-
-  const policyModalEl = document.getElementById('policyEditorModal');
-  const policyModal = new bootstrap.Modal(policyModalEl);
-  const userLabelEl = document.getElementById('policyEditorUserLabel');
-  const userInputEl = document.getElementById('policyEditorUser');
-  const textareaEl = document.getElementById('policyEditorDocument');
-  const formEl = document.getElementById('policyEditorForm');
-  const templateButtons = document.querySelectorAll('[data-policy-template]');
-  const iamLocked = {{ iam_locked | tojson }};
-
-  if (iamLocked) return;
-
-  const userPolicies = (accessKey) => {
-    const target = users.find((user) => user.access_key === accessKey);
-    return target ? JSON.stringify(target.policies, null, 2) : '';
-  };
-
-  const applyTemplate = (name) => {
-    const templates = {
-      full: [
-        {
-          bucket: '*',
-          actions: ['list', 'read', 'write', 'delete', 'share', 'policy', 'replication', 'iam:list_users', 'iam:*'],
-        },
-      ],
-      readonly: [
-        {
-          bucket: '*',
-          actions: ['list', 'read'],
-        },
-      ],
-      writer: [
-        {
-          bucket: '*',
-          actions: ['list', 'read', 'write'],
-        },
-      ],
-    };
-    if (templates[name]) {
-      textareaEl.value = JSON.stringify(templates[name], null, 2);
-    }
-  };
-
-  templateButtons.forEach((button) => {
-    button.addEventListener('click', () => applyTemplate(button.dataset.policyTemplate));
-  });
-
-  const createUserPoliciesEl = document.getElementById('createUserPolicies');
-  const createTemplateButtons = document.querySelectorAll('[data-create-policy-template]');
-
-  const applyCreateTemplate = (name) => {
-    const templates = {
-      full: [
-        {
-          bucket: '*',
-          actions: ['list', 'read', 'write', 'delete', 'share', 'policy', 'replication', 'iam:list_users', 'iam:*'],
-        },
-      ],
-      readonly: [
-        {
-          bucket: '*',
-          actions: ['list', 'read'],
-        },
-      ],
-      writer: [
-        {
-          bucket: '*',
-          actions: ['list', 'read', 'write'],
-        },
-      ],
-    };
-    if (templates[name] && createUserPoliciesEl) {
-      createUserPoliciesEl.value = JSON.stringify(templates[name], null, 2);
-    }
-  };
-
-  createTemplateButtons.forEach((button) => {
-    button.addEventListener('click', () => applyCreateTemplate(button.dataset.createPolicyTemplate));
-  });
-
-  formEl?.addEventListener('submit', (event) => {
-    const key = userInputEl.value;
-    if (!key) {
-      event.preventDefault();
-      return;
-    }
-    const template = formEl.dataset.actionTemplate;
-    formEl.action = template.replace('ACCESS_KEY_PLACEHOLDER', key);
-  });
-
-  document.querySelectorAll('[data-policy-editor]').forEach((button) => {
-    button.addEventListener('click', () => {
-      const key = button.getAttribute('data-access-key');
-      if (!key) return;
-
-      userLabelEl.textContent = key;
-      userInputEl.value = key;
-      textareaEl.value = userPolicies(key);
-
-      policyModal.show();
-    });
-  });
-
-  const editUserModal = new bootstrap.Modal(document.getElementById('editUserModal'));
-  const editUserForm = document.getElementById('editUserForm');
-  const editUserDisplayName = document.getElementById('editUserDisplayName');
-
-  document.querySelectorAll('[data-edit-user]').forEach(btn => {
-    btn.addEventListener('click', () => {
-      const key = btn.dataset.editUser;
-      const name = btn.dataset.displayName;
-      editUserDisplayName.value = name;
-      editUserForm.action = "{{ url_for('ui.update_iam_user', access_key='ACCESS_KEY') }}".replace('ACCESS_KEY', key);
-      editUserModal.show();
-    });
-  });
-
-  const deleteUserModal = new bootstrap.Modal(document.getElementById('deleteUserModal'));
-  const deleteUserForm = document.getElementById('deleteUserForm');
-  const deleteUserLabel = document.getElementById('deleteUserLabel');
-  const deleteSelfWarning = document.getElementById('deleteSelfWarning');
-
-  document.querySelectorAll('[data-delete-user]').forEach(btn => {
-    btn.addEventListener('click', () => {
-      const key = btn.dataset.deleteUser;
-      deleteUserLabel.textContent = key;
-      deleteUserForm.action = "{{ url_for('ui.delete_iam_user', access_key='ACCESS_KEY') }}".replace('ACCESS_KEY', key);
-
-      if (key === currentUserKey) {
-        deleteSelfWarning.classList.remove('d-none');
-      } else {
-        deleteSelfWarning.classList.add('d-none');
-      }
-
-      deleteUserModal.show();
-    });
-  });
-
-  const rotateSecretModal = new bootstrap.Modal(document.getElementById('rotateSecretModal'));
-  const rotateUserLabel = document.getElementById('rotateUserLabel');
-  const confirmRotateBtn = document.getElementById('confirmRotateBtn');
-  const rotateCancelBtn = document.getElementById('rotateCancelBtn');
-  const rotateDoneBtn = document.getElementById('rotateDoneBtn');
-  const rotateSecretConfirm = document.getElementById('rotateSecretConfirm');
-  const rotateSecretResult = document.getElementById('rotateSecretResult');
-  const newSecretKeyInput = document.getElementById('newSecretKey');
-  const copyNewSecretBtn = document.getElementById('copyNewSecret');
-  let currentRotateKey = null;
-
-  document.querySelectorAll('[data-rotate-user]').forEach(btn => {
-    btn.addEventListener('click', () => {
-      currentRotateKey = btn.dataset.rotateUser;
-      rotateUserLabel.textContent = currentRotateKey;
-
-      rotateSecretConfirm.classList.remove('d-none');
-      rotateSecretResult.classList.add('d-none');
-      confirmRotateBtn.classList.remove('d-none');
-      rotateCancelBtn.classList.remove('d-none');
-      rotateDoneBtn.classList.add('d-none');
-
-      rotateSecretModal.show();
-    });
-  });
-
-  confirmRotateBtn.addEventListener('click', async () => {
-    if (!currentRotateKey) return;
-
-    confirmRotateBtn.disabled = true;
-    confirmRotateBtn.textContent = "Rotating...";
-
-    try {
-      const url = "{{ url_for('ui.rotate_iam_secret', access_key='ACCESS_KEY') }}".replace('ACCESS_KEY', currentRotateKey);
-      const response = await fetch(url, {
-        method: 'POST',
-        headers: {
-          'Accept': 'application/json',
-          'X-CSRFToken': "{{ csrf_token() }}"
-        }
-      });
-
-      if (!response.ok) {
-        const data = await response.json();
-        throw new Error(data.error || 'Failed to rotate secret');
-      }
-
-      const data = await response.json();
-      newSecretKeyInput.value = data.secret_key;
-
-      rotateSecretConfirm.classList.add('d-none');
-      rotateSecretResult.classList.remove('d-none');
-      confirmRotateBtn.classList.add('d-none');
-      rotateCancelBtn.classList.add('d-none');
-      rotateDoneBtn.classList.remove('d-none');
-
-    } catch (err) {
-      if (window.showToast) {
-        window.showToast(err.message, 'Error', 'danger');
-      }
-      rotateSecretModal.hide();
-    } finally {
-      confirmRotateBtn.disabled = false;
-      confirmRotateBtn.textContent = "Rotate Key";
-    }
-  });
-
-  copyNewSecretBtn.addEventListener('click', async () => {
-    try {
-      await navigator.clipboard.writeText(newSecretKeyInput.value);
-      copyNewSecretBtn.textContent = 'Copied!';
-      setTimeout(() => copyNewSecretBtn.textContent = 'Copy', 1500);
-    } catch (err) {
-      console.error('Failed to copy', err);
-    }
-  });
-
-  rotateDoneBtn.addEventListener('click', () => {
-    window.location.reload();
-  });
-})();
+IAMManagement.init({
+  users: JSON.parse(document.getElementById('iamUsersJson').textContent || '[]'),
+  currentUserKey: {{ principal.access_key | tojson }},
+  iamLocked: {{ iam_locked | tojson }},
+  csrfToken: "{{ csrf_token() }}",
+  endpoints: {
+    createUser: "{{ url_for('ui.create_iam_user') }}",
+    updateUser: "{{ url_for('ui.update_iam_user', access_key='ACCESS_KEY') }}",
+    deleteUser: "{{ url_for('ui.delete_iam_user', access_key='ACCESS_KEY') }}",
+    updatePolicies: "{{ url_for('ui.update_iam_policies', access_key='ACCESS_KEY') }}",
+    rotateSecret: "{{ url_for('ui.rotate_iam_secret', access_key='ACCESS_KEY') }}"
+  }
+});
</script>
{% endblock %}
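For reference, the policy templates removed from the inline script above (and presumably now carried by `iam-management.js`) all share one document shape. The "writer" variant, written out as a Python literal:

# The "writer" policy template from the script above, as a plain literal.
writer_policy = [
    {
        "bucket": "*",                         # applies to every bucket
        "actions": ["list", "read", "write"],  # no delete/share/policy/iam rights
    }
]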
@@ -6,11 +6,11 @@
<p class="text-muted mb-0">Real-time server performance and storage usage</p>
</div>
<div class="d-flex gap-2 align-items-center">
-<span class="d-flex align-items-center gap-2 text-muted small">
+<span class="d-flex align-items-center gap-2 text-muted small" id="metricsLiveIndicator">
<span class="live-indicator"></span>
-Live
+Auto-refresh: <span id="refreshCountdown">5</span>s
</span>
-<button class="btn btn-outline-secondary btn-sm" onclick="window.location.reload()">
+<button class="btn btn-outline-secondary btn-sm" id="refreshMetricsBtn">
<svg xmlns="http://www.w3.org/2000/svg" width="14" height="14" fill="currentColor" class="bi bi-arrow-clockwise me-1" viewBox="0 0 16 16">
<path fill-rule="evenodd" d="M8 3a5 5 0 1 0 4.546 2.914.5.5 0 0 1 .908-.417A6 6 0 1 1 8 2v1z"/>
<path d="M8 4.466V.534a.25.25 0 0 1 .41-.192l2.36 1.966c.12.1.12.284 0 .384L8.41 4.658A.25.25 0 0 1 8 4.466z"/>
@@ -32,15 +32,13 @@
</svg>
</div>
</div>
-<h2 class="display-6 fw-bold mb-2 stat-value">{{ cpu_percent }}<span class="fs-4 fw-normal text-muted">%</span></h2>
+<h2 class="display-6 fw-bold mb-2 stat-value"><span data-metric="cpu_percent">{{ cpu_percent }}</span><span class="fs-4 fw-normal text-muted">%</span></h2>
<div class="progress" style="height: 8px; border-radius: 4px;">
-<div class="progress-bar {% if cpu_percent > 80 %}bg-danger{% elif cpu_percent > 50 %}bg-warning{% else %}bg-primary{% endif %}" role="progressbar" style="width: {{ cpu_percent }}%"></div>
+<div class="progress-bar bg-primary" data-metric="cpu_bar" role="progressbar" style="width: {{ cpu_percent }}%"></div>
</div>
<div class="mt-2 d-flex justify-content-between">
<small class="text-muted">Current load</small>
-<small class="{% if cpu_percent > 80 %}text-danger{% elif cpu_percent > 50 %}text-warning{% else %}text-success{% endif %}">
-{% if cpu_percent > 80 %}High{% elif cpu_percent > 50 %}Medium{% else %}Normal{% endif %}
-</small>
+<small data-metric="cpu_status" class="text-success">Normal</small>
</div>
</div>
</div>
@@ -57,13 +55,13 @@
</svg>
</div>
</div>
-<h2 class="display-6 fw-bold mb-2 stat-value">{{ memory.percent }}<span class="fs-4 fw-normal text-muted">%</span></h2>
+<h2 class="display-6 fw-bold mb-2 stat-value"><span data-metric="memory_percent">{{ memory.percent }}</span><span class="fs-4 fw-normal text-muted">%</span></h2>
<div class="progress" style="height: 8px; border-radius: 4px;">
-<div class="progress-bar bg-info" role="progressbar" style="width: {{ memory.percent }}%"></div>
+<div class="progress-bar bg-info" data-metric="memory_bar" role="progressbar" style="width: {{ memory.percent }}%"></div>
</div>
<div class="mt-2 d-flex justify-content-between">
-<small class="text-muted">{{ memory.used }} used</small>
+<small class="text-muted"><span data-metric="memory_used">{{ memory.used }}</span> used</small>
-<small class="text-muted">{{ memory.total }} total</small>
+<small class="text-muted"><span data-metric="memory_total">{{ memory.total }}</span> total</small>
</div>
</div>
</div>
@@ -81,13 +79,13 @@
</svg>
</div>
</div>
-<h2 class="display-6 fw-bold mb-2 stat-value">{{ disk.percent }}<span class="fs-4 fw-normal text-muted">%</span></h2>
+<h2 class="display-6 fw-bold mb-2 stat-value"><span data-metric="disk_percent">{{ disk.percent }}</span><span class="fs-4 fw-normal text-muted">%</span></h2>
<div class="progress" style="height: 8px; border-radius: 4px;">
-<div class="progress-bar {% if disk.percent > 90 %}bg-danger{% elif disk.percent > 75 %}bg-warning{% else %}bg-warning{% endif %}" role="progressbar" style="width: {{ disk.percent }}%"></div>
+<div class="progress-bar bg-warning" data-metric="disk_bar" role="progressbar" style="width: {{ disk.percent }}%"></div>
</div>
<div class="mt-2 d-flex justify-content-between">
-<small class="text-muted">{{ disk.free }} free</small>
+<small class="text-muted"><span data-metric="disk_free">{{ disk.free }}</span> free</small>
-<small class="text-muted">{{ disk.total }} total</small>
+<small class="text-muted"><span data-metric="disk_total">{{ disk.total }}</span> total</small>
</div>
</div>
</div>
@@ -104,15 +102,15 @@
</svg>
</div>
</div>
-<h2 class="display-6 fw-bold mb-2 stat-value">{{ app.storage_used }}</h2>
+<h2 class="display-6 fw-bold mb-2 stat-value" data-metric="storage_used">{{ app.storage_used }}</h2>
<div class="d-flex gap-3 mt-3">
<div class="text-center flex-fill">
-<div class="h5 fw-bold mb-0">{{ app.buckets }}</div>
+<div class="h5 fw-bold mb-0" data-metric="buckets_count">{{ app.buckets }}</div>
<small class="text-muted">Buckets</small>
</div>
<div class="vr"></div>
<div class="text-center flex-fill">
-<div class="h5 fw-bold mb-0">{{ app.objects }}</div>
+<div class="h5 fw-bold mb-0" data-metric="objects_count">{{ app.objects }}</div>
<small class="text-muted">Objects</small>
</div>
</div>
@@ -270,3 +268,109 @@
</div>
</div>
{% endblock %}
+
+{% block extra_scripts %}
+<script>
+(function() {
+  var refreshInterval = 5000;
+  var countdown = 5;
+  var countdownEl = document.getElementById('refreshCountdown');
+  var refreshBtn = document.getElementById('refreshMetricsBtn');
+  var countdownTimer = null;
+  var fetchTimer = null;
+
+  function updateMetrics() {
+    fetch('/ui/metrics/api')
+      .then(function(resp) { return resp.json(); })
+      .then(function(data) {
+        var el;
+        el = document.querySelector('[data-metric="cpu_percent"]');
+        if (el) el.textContent = data.cpu_percent;
+        el = document.querySelector('[data-metric="cpu_bar"]');
+        if (el) {
+          el.style.width = data.cpu_percent + '%';
+          el.className = 'progress-bar ' + (data.cpu_percent > 80 ? 'bg-danger' : data.cpu_percent > 50 ? 'bg-warning' : 'bg-primary');
+        }
+        el = document.querySelector('[data-metric="cpu_status"]');
+        if (el) {
+          el.textContent = data.cpu_percent > 80 ? 'High' : data.cpu_percent > 50 ? 'Medium' : 'Normal';
+          el.className = data.cpu_percent > 80 ? 'text-danger' : data.cpu_percent > 50 ? 'text-warning' : 'text-success';
+        }
+
+        el = document.querySelector('[data-metric="memory_percent"]');
+        if (el) el.textContent = data.memory.percent;
+        el = document.querySelector('[data-metric="memory_bar"]');
+        if (el) el.style.width = data.memory.percent + '%';
+        el = document.querySelector('[data-metric="memory_used"]');
+        if (el) el.textContent = data.memory.used;
+        el = document.querySelector('[data-metric="memory_total"]');
+        if (el) el.textContent = data.memory.total;
+
+        el = document.querySelector('[data-metric="disk_percent"]');
+        if (el) el.textContent = data.disk.percent;
+        el = document.querySelector('[data-metric="disk_bar"]');
+        if (el) {
+          el.style.width = data.disk.percent + '%';
+          el.className = 'progress-bar ' + (data.disk.percent > 90 ? 'bg-danger' : 'bg-warning');
+        }
+        el = document.querySelector('[data-metric="disk_free"]');
+        if (el) el.textContent = data.disk.free;
+        el = document.querySelector('[data-metric="disk_total"]');
+        if (el) el.textContent = data.disk.total;
+
+        el = document.querySelector('[data-metric="storage_used"]');
+        if (el) el.textContent = data.app.storage_used;
+        el = document.querySelector('[data-metric="buckets_count"]');
+        if (el) el.textContent = data.app.buckets;
+        el = document.querySelector('[data-metric="objects_count"]');
+        if (el) el.textContent = data.app.objects;
+
+        countdown = 5;
+      })
+      .catch(function(err) {
+        console.error('Metrics fetch error:', err);
+      });
+  }
+
+  function startCountdown() {
+    if (countdownTimer) clearInterval(countdownTimer);
+    countdown = 5;
+    if (countdownEl) countdownEl.textContent = countdown;
+    countdownTimer = setInterval(function() {
+      countdown--;
+      if (countdownEl) countdownEl.textContent = countdown;
+      if (countdown <= 0) {
+        countdown = 5;
+      }
+    }, 1000);
+  }
+
+  function startPolling() {
+    if (fetchTimer) clearInterval(fetchTimer);
+    fetchTimer = setInterval(function() {
+      if (!document.hidden) {
+        updateMetrics();
+      }
+    }, refreshInterval);
+    startCountdown();
+  }
+
+  if (refreshBtn) {
+    refreshBtn.addEventListener('click', function() {
+      updateMetrics();
+      countdown = 5;
+      if (countdownEl) countdownEl.textContent = countdown;
+    });
+  }
+
+  document.addEventListener('visibilitychange', function() {
+    if (!document.hidden) {
+      updateMetrics();
+      startPolling();
+    }
+  });
+
+  startPolling();
+})();
+</script>
+{% endblock %}
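The polling script above expects `/ui/metrics/api` to return exactly the fields it writes into the `data-metric` spans. A sketch of an endpoint producing that JSON shape with psutil; the route registration, byte formatting, and app counters here are assumptions, not the project's actual code:

import psutil
from flask import Flask, jsonify

app = Flask(__name__)

def human(n):
    # Illustrative byte formatter; the real app's formatting may differ.
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if n < 1024 or unit == "TB":
            return f"{n:.1f} {unit}"
        n /= 1024

@app.get("/ui/metrics/api")
def metrics_api():
    mem = psutil.virtual_memory()
    disk = psutil.disk_usage("/")
    return jsonify({
        "cpu_percent": psutil.cpu_percent(),
        "memory": {"percent": mem.percent, "used": human(mem.used), "total": human(mem.total)},
        "disk": {"percent": disk.percent, "free": human(disk.free), "total": human(disk.total)},
        # Placeholder app counters; the real values come from the object store.
        "app": {"storage_used": "0 B", "buckets": 0, "objects": 0},
    })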
tests/test_access_logging.py (new file, 339 lines)
@@ -0,0 +1,339 @@
|
|||||||
|
import io
|
||||||
|
import json
|
||||||
|
import time
|
||||||
|
from datetime import datetime, timezone
|
||||||
|
from pathlib import Path
|
||||||
|
from unittest.mock import MagicMock, patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
from app.access_logging import (
|
||||||
|
AccessLogEntry,
|
||||||
|
AccessLoggingService,
|
||||||
|
LoggingConfiguration,
|
||||||
|
)
|
||||||
|
from app.storage import ObjectStorage
|
||||||
|
|
||||||
|
|
||||||
|
class TestAccessLogEntry:
|
||||||
|
def test_default_values(self):
|
||||||
|
entry = AccessLogEntry()
|
||||||
|
assert entry.bucket_owner == "-"
|
||||||
|
assert entry.bucket == "-"
|
||||||
|
assert entry.remote_ip == "-"
|
||||||
|
assert entry.requester == "-"
|
||||||
|
assert entry.operation == "-"
|
||||||
|
assert entry.http_status == 200
|
||||||
|
assert len(entry.request_id) == 16
|
||||||
|
|
||||||
|
def test_to_log_line(self):
|
||||||
|
entry = AccessLogEntry(
|
||||||
|
bucket_owner="owner123",
|
||||||
|
bucket="my-bucket",
|
||||||
|
remote_ip="192.168.1.1",
|
||||||
|
requester="user456",
|
||||||
|
request_id="REQ123456789012",
|
||||||
|
operation="REST.PUT.OBJECT",
|
||||||
|
key="test/key.txt",
|
||||||
|
request_uri="PUT /my-bucket/test/key.txt HTTP/1.1",
|
||||||
|
http_status=200,
|
||||||
|
bytes_sent=1024,
|
||||||
|
object_size=2048,
|
||||||
|
total_time_ms=150,
|
||||||
|
referrer="http://example.com",
|
||||||
|
user_agent="aws-cli/2.0",
|
||||||
|
version_id="v1",
|
||||||
|
)
|
||||||
|
log_line = entry.to_log_line()
|
||||||
|
|
||||||
|
assert "owner123" in log_line
|
||||||
|
assert "my-bucket" in log_line
|
||||||
|
assert "192.168.1.1" in log_line
|
||||||
|
assert "user456" in log_line
|
||||||
|
assert "REST.PUT.OBJECT" in log_line
|
||||||
|
assert "test/key.txt" in log_line
|
||||||
|
assert "200" in log_line
|
||||||
|
|
||||||
|
def test_to_dict(self):
|
||||||
|
entry = AccessLogEntry(
|
||||||
|
bucket_owner="owner",
|
||||||
|
bucket="bucket",
|
||||||
|
remote_ip="10.0.0.1",
|
||||||
|
requester="admin",
|
||||||
|
request_id="ABC123",
|
||||||
|
operation="REST.GET.OBJECT",
|
||||||
|
key="file.txt",
|
||||||
|
request_uri="GET /bucket/file.txt HTTP/1.1",
|
||||||
|
http_status=200,
|
||||||
|
bytes_sent=512,
|
||||||
|
object_size=512,
|
||||||
|
total_time_ms=50,
|
||||||
|
)
|
||||||
|
result = entry.to_dict()
|
||||||
|
|
||||||
|
assert result["bucket_owner"] == "owner"
|
||||||
|
assert result["bucket"] == "bucket"
|
||||||
|
assert result["remote_ip"] == "10.0.0.1"
|
||||||
|
assert result["requester"] == "admin"
|
||||||
|
assert result["operation"] == "REST.GET.OBJECT"
|
||||||
|
assert result["key"] == "file.txt"
|
||||||
|
assert result["http_status"] == 200
|
||||||
|
assert result["bytes_sent"] == 512
|
||||||
|
|
||||||
|
|
||||||
|
class TestLoggingConfiguration:
|
||||||
|
def test_default_values(self):
|
||||||
|
config = LoggingConfiguration(target_bucket="log-bucket")
|
||||||
|
assert config.target_bucket == "log-bucket"
|
||||||
|
assert config.target_prefix == ""
|
||||||
|
assert config.enabled is True
|
||||||
|
|
||||||
|
def test_to_dict(self):
|
||||||
|
config = LoggingConfiguration(
|
||||||
|
target_bucket="logs",
|
||||||
|
target_prefix="access-logs/",
|
||||||
|
enabled=True,
|
||||||
|
)
|
||||||
|
result = config.to_dict()
|
||||||
|
|
||||||
|
assert "LoggingEnabled" in result
|
||||||
|
assert result["LoggingEnabled"]["TargetBucket"] == "logs"
|
||||||
|
assert result["LoggingEnabled"]["TargetPrefix"] == "access-logs/"
|
||||||
|
|
||||||
|
def test_from_dict(self):
|
||||||
|
data = {
|
||||||
|
"LoggingEnabled": {
|
||||||
|
"TargetBucket": "my-logs",
|
||||||
|
"TargetPrefix": "bucket-logs/",
|
||||||
|
}
|
||||||
|
}
|
||||||
|
config = LoggingConfiguration.from_dict(data)
|
||||||
|
|
||||||
|
assert config is not None
|
||||||
|
assert config.target_bucket == "my-logs"
|
||||||
|
assert config.target_prefix == "bucket-logs/"
|
||||||
|
assert config.enabled is True
|
||||||
|
|
||||||
|
def test_from_dict_no_logging(self):
|
||||||
|
data = {}
|
||||||
|
config = LoggingConfiguration.from_dict(data)
|
||||||
|
assert config is None
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def storage(tmp_path: Path):
|
||||||
|
storage_root = tmp_path / "data"
|
||||||
|
storage_root.mkdir(parents=True)
|
||||||
|
return ObjectStorage(storage_root)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def logging_service(tmp_path: Path, storage):
|
||||||
|
service = AccessLoggingService(
|
||||||
|
tmp_path,
|
||||||
|
flush_interval=3600,
|
||||||
|
max_buffer_size=10,
|
||||||
|
)
|
||||||
|
service.set_storage(storage)
|
||||||
|
yield service
|
||||||
|
service.shutdown()
|
||||||
|
|
||||||
|
|
||||||
|
class TestAccessLoggingService:
|
||||||
|
def test_get_bucket_logging_not_configured(self, logging_service):
|
||||||
|
result = logging_service.get_bucket_logging("unconfigured-bucket")
|
||||||
|
assert result is None
|
||||||
|
|
||||||
|
def test_set_and_get_bucket_logging(self, logging_service):
|
||||||
|
config = LoggingConfiguration(
|
||||||
|
target_bucket="log-bucket",
|
||||||
|
target_prefix="logs/",
|
||||||
|
)
|
||||||
|
logging_service.set_bucket_logging("source-bucket", config)
|
||||||
|
|
||||||
|
retrieved = logging_service.get_bucket_logging("source-bucket")
|
||||||
|
assert retrieved is not None
|
||||||
|
assert retrieved.target_bucket == "log-bucket"
|
||||||
|
assert retrieved.target_prefix == "logs/"
|
||||||
|
|
||||||
|
def test_delete_bucket_logging(self, logging_service):
|
||||||
|
config = LoggingConfiguration(target_bucket="logs")
|
||||||
|
logging_service.set_bucket_logging("to-delete", config)
|
||||||
|
assert logging_service.get_bucket_logging("to-delete") is not None
|
||||||
|
|
||||||
|
logging_service.delete_bucket_logging("to-delete")
|
||||||
|
logging_service._configs.clear()
|
||||||
|
assert logging_service.get_bucket_logging("to-delete") is None
|
||||||
|
|
||||||
|
def test_log_request_no_config(self, logging_service):
|
||||||
|
logging_service.log_request(
|
||||||
|
"no-config-bucket",
|
||||||
|
operation="REST.GET.OBJECT",
|
||||||
|
key="test.txt",
|
||||||
|
)
|
||||||
|
stats = logging_service.get_stats()
|
||||||
|
assert stats["buffered_entries"] == 0
|
||||||
|
|
||||||
|
def test_log_request_with_config(self, logging_service, storage):
|
||||||
|
storage.create_bucket("log-target")
|
||||||
|
|
||||||
|
config = LoggingConfiguration(
|
||||||
|
target_bucket="log-target",
|
||||||
|
target_prefix="access/",
|
||||||
|
)
|
||||||
|
logging_service.set_bucket_logging("source-bucket", config)
|
||||||
|
|
||||||
|
logging_service.log_request(
|
||||||
|
"source-bucket",
|
||||||
|
operation="REST.PUT.OBJECT",
|
||||||
|
key="uploaded.txt",
|
||||||
|
remote_ip="192.168.1.100",
|
||||||
|
requester="test-user",
|
||||||
|
http_status=200,
|
||||||
|
bytes_sent=1024,
|
||||||
|
)
|
||||||
|
|
||||||
|
stats = logging_service.get_stats()
|
||||||
|
assert stats["buffered_entries"] == 1
|
||||||
|
|
||||||
|
def test_log_request_disabled_config(self, logging_service):
|
||||||
|
config = LoggingConfiguration(
|
||||||
|
target_bucket="logs",
|
||||||
|
enabled=False,
|
||||||
|
)
|
||||||
|
logging_service.set_bucket_logging("disabled-bucket", config)
|
||||||
|
|
||||||
|
logging_service.log_request(
|
||||||
|
"disabled-bucket",
|
||||||
|
operation="REST.GET.OBJECT",
|
||||||
|
key="test.txt",
|
||||||
|
)
|
||||||
|
|
||||||
|
stats = logging_service.get_stats()
|
||||||
|
assert stats["buffered_entries"] == 0
|
||||||
|
|
||||||
|
def test_flush_buffer(self, logging_service, storage):
|
||||||
|
storage.create_bucket("flush-target")
|
||||||
|
|
||||||
|
config = LoggingConfiguration(
|
||||||
|
target_bucket="flush-target",
|
||||||
|
target_prefix="logs/",
|
||||||
|
)
|
||||||
|
logging_service.set_bucket_logging("flush-source", config)
|
||||||
|
|
||||||
|
for i in range(3):
|
||||||
|
logging_service.log_request(
|
||||||
|
"flush-source",
|
||||||
|
operation="REST.GET.OBJECT",
|
||||||
|
key=f"file{i}.txt",
|
||||||
|
)
|
||||||
|
|
||||||
|
logging_service.flush()
|
||||||
|
|
||||||
|
objects = storage.list_objects_all("flush-target")
|
||||||
|
assert len(objects) >= 1
|
||||||
|
|
||||||
|
    def test_auto_flush_on_buffer_size(self, logging_service, storage):
        storage.create_bucket("auto-flush-target")

        config = LoggingConfiguration(
            target_bucket="auto-flush-target",
            target_prefix="",
        )
        logging_service.set_bucket_logging("auto-source", config)

        for i in range(15):
            logging_service.log_request(
                "auto-source",
                operation="REST.GET.OBJECT",
                key=f"file{i}.txt",
            )

        objects = storage.list_objects_all("auto-flush-target")
        assert len(objects) >= 1

    def test_get_stats(self, logging_service, storage):
        storage.create_bucket("stats-target")
        config = LoggingConfiguration(target_bucket="stats-target")
        logging_service.set_bucket_logging("stats-bucket", config)

        logging_service.log_request(
            "stats-bucket",
            operation="REST.GET.OBJECT",
            key="test.txt",
        )

        stats = logging_service.get_stats()
        assert "buffered_entries" in stats
        assert "target_buckets" in stats
        assert stats["buffered_entries"] >= 1

    def test_shutdown_flushes_buffer(self, tmp_path, storage):
        storage.create_bucket("shutdown-target")

        service = AccessLoggingService(tmp_path, flush_interval=3600, max_buffer_size=100)
        service.set_storage(storage)

        config = LoggingConfiguration(target_bucket="shutdown-target")
        service.set_bucket_logging("shutdown-source", config)

        service.log_request(
            "shutdown-source",
            operation="REST.PUT.OBJECT",
            key="final.txt",
        )

        service.shutdown()

        objects = storage.list_objects_all("shutdown-target")
        assert len(objects) >= 1

    def test_logging_caching(self, logging_service):
        config = LoggingConfiguration(target_bucket="cached-logs")
        logging_service.set_bucket_logging("cached-bucket", config)

        logging_service.get_bucket_logging("cached-bucket")
        assert "cached-bucket" in logging_service._configs

    def test_log_request_all_fields(self, logging_service, storage):
        storage.create_bucket("detailed-target")

        config = LoggingConfiguration(target_bucket="detailed-target", target_prefix="detailed/")
        logging_service.set_bucket_logging("detailed-source", config)

        logging_service.log_request(
            "detailed-source",
            operation="REST.PUT.OBJECT",
            key="detailed/file.txt",
            remote_ip="10.0.0.1",
            requester="admin-user",
            request_uri="PUT /detailed-source/detailed/file.txt HTTP/1.1",
            http_status=201,
            error_code="",
            bytes_sent=2048,
            object_size=2048,
            total_time_ms=100,
            referrer="http://admin.example.com",
            user_agent="curl/7.68.0",
            version_id="v1.0",
            request_id="CUSTOM_REQ_ID",
        )

        stats = logging_service.get_stats()
        assert stats["buffered_entries"] == 1
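
    # The target bucket below is never created, so flush() should fail; the
    # expectation is that failed entries are returned to the buffer rather than
    # dropped, hence the buffered count can only stay the same or grow.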
    def test_failed_flush_returns_to_buffer(self, logging_service):
        config = LoggingConfiguration(target_bucket="nonexistent-target")
        logging_service.set_bucket_logging("fail-source", config)

        logging_service.log_request(
            "fail-source",
            operation="REST.GET.OBJECT",
            key="test.txt",
        )

        initial_count = logging_service.get_stats()["buffered_entries"]
        logging_service.flush()

        final_count = logging_service.get_stats()["buffered_entries"]
        assert final_count >= initial_count
284
tests/test_acl.py
Normal file
@@ -0,0 +1,284 @@
import json
from pathlib import Path

import pytest

from app.acl import (
    Acl,
    AclGrant,
    AclService,
    ACL_PERMISSION_FULL_CONTROL,
    ACL_PERMISSION_READ,
    ACL_PERMISSION_WRITE,
    ACL_PERMISSION_READ_ACP,
    ACL_PERMISSION_WRITE_ACP,
    GRANTEE_ALL_USERS,
    GRANTEE_AUTHENTICATED_USERS,
    PERMISSION_TO_ACTIONS,
    create_canned_acl,
    CANNED_ACLS,
)


class TestAclGrant:
    def test_to_dict(self):
        grant = AclGrant(grantee="user123", permission=ACL_PERMISSION_READ)
        result = grant.to_dict()
        assert result == {"grantee": "user123", "permission": "READ"}

    def test_from_dict(self):
        data = {"grantee": "admin", "permission": "FULL_CONTROL"}
        grant = AclGrant.from_dict(data)
        assert grant.grantee == "admin"
        assert grant.permission == ACL_PERMISSION_FULL_CONTROL


class TestAcl:
    def test_to_dict(self):
        acl = Acl(
            owner="owner-user",
            grants=[
                AclGrant(grantee="owner-user", permission=ACL_PERMISSION_FULL_CONTROL),
                AclGrant(grantee=GRANTEE_ALL_USERS, permission=ACL_PERMISSION_READ),
            ],
        )
        result = acl.to_dict()
        assert result["owner"] == "owner-user"
        assert len(result["grants"]) == 2
        assert result["grants"][0]["grantee"] == "owner-user"
        assert result["grants"][1]["grantee"] == "*"

    def test_from_dict(self):
        data = {
            "owner": "the-owner",
            "grants": [
                {"grantee": "the-owner", "permission": "FULL_CONTROL"},
                {"grantee": "authenticated", "permission": "READ"},
            ],
        }
        acl = Acl.from_dict(data)
        assert acl.owner == "the-owner"
        assert len(acl.grants) == 2
        assert acl.grants[0].grantee == "the-owner"
        assert acl.grants[1].grantee == GRANTEE_AUTHENTICATED_USERS

    def test_from_dict_empty_grants(self):
        data = {"owner": "solo-owner"}
        acl = Acl.from_dict(data)
        assert acl.owner == "solo-owner"
        assert len(acl.grants) == 0
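
    # The get_allowed_actions tests below cover the four grantee cases: the owner
    # (implicit FULL_CONTROL), the "*" all-users grantee, the authenticated-users
    # grantee, and grants addressed to a specific user id.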
    def test_get_allowed_actions_owner(self):
        acl = Acl(owner="owner123", grants=[])
        actions = acl.get_allowed_actions("owner123", is_authenticated=True)
        assert actions == PERMISSION_TO_ACTIONS[ACL_PERMISSION_FULL_CONTROL]

    def test_get_allowed_actions_all_users(self):
        acl = Acl(
            owner="owner",
            grants=[AclGrant(grantee=GRANTEE_ALL_USERS, permission=ACL_PERMISSION_READ)],
        )
        actions = acl.get_allowed_actions(None, is_authenticated=False)
        assert "read" in actions
        assert "list" in actions
        assert "write" not in actions

    def test_get_allowed_actions_authenticated_users(self):
        acl = Acl(
            owner="owner",
            grants=[AclGrant(grantee=GRANTEE_AUTHENTICATED_USERS, permission=ACL_PERMISSION_WRITE)],
        )
        actions_authenticated = acl.get_allowed_actions("some-user", is_authenticated=True)
        assert "write" in actions_authenticated
        assert "delete" in actions_authenticated

        actions_anonymous = acl.get_allowed_actions(None, is_authenticated=False)
        assert "write" not in actions_anonymous

    def test_get_allowed_actions_specific_grantee(self):
        acl = Acl(
            owner="owner",
            grants=[
                AclGrant(grantee="user-abc", permission=ACL_PERMISSION_READ),
                AclGrant(grantee="user-xyz", permission=ACL_PERMISSION_WRITE),
            ],
        )
        abc_actions = acl.get_allowed_actions("user-abc", is_authenticated=True)
        assert "read" in abc_actions
        assert "list" in abc_actions
        assert "write" not in abc_actions

        xyz_actions = acl.get_allowed_actions("user-xyz", is_authenticated=True)
        assert "write" in xyz_actions
        assert "read" not in xyz_actions

    def test_get_allowed_actions_combined(self):
        acl = Acl(
            owner="owner",
            grants=[
                AclGrant(grantee=GRANTEE_ALL_USERS, permission=ACL_PERMISSION_READ),
                AclGrant(grantee="special-user", permission=ACL_PERMISSION_WRITE),
            ],
        )
        actions = acl.get_allowed_actions("special-user", is_authenticated=True)
        assert "read" in actions
        assert "list" in actions
        assert "write" in actions
        assert "delete" in actions
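

# Canned ACLs follow the familiar S3 shorthand: "private" grants only the owner
# FULL_CONTROL, "public-read" adds READ for all users, "public-read-write" adds
# READ and WRITE, "authenticated-read" grants READ to authenticated users, and
# unrecognised names are expected to fall back to "private".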
class TestCannedAcls:
    def test_private_acl(self):
        acl = create_canned_acl("private", "the-owner")
        assert acl.owner == "the-owner"
        assert len(acl.grants) == 1
        assert acl.grants[0].grantee == "the-owner"
        assert acl.grants[0].permission == ACL_PERMISSION_FULL_CONTROL

    def test_public_read_acl(self):
        acl = create_canned_acl("public-read", "owner")
        assert acl.owner == "owner"
        has_owner_full_control = any(
            g.grantee == "owner" and g.permission == ACL_PERMISSION_FULL_CONTROL for g in acl.grants
        )
        has_public_read = any(
            g.grantee == GRANTEE_ALL_USERS and g.permission == ACL_PERMISSION_READ for g in acl.grants
        )
        assert has_owner_full_control
        assert has_public_read

    def test_public_read_write_acl(self):
        acl = create_canned_acl("public-read-write", "owner")
        assert acl.owner == "owner"
        has_public_read = any(
            g.grantee == GRANTEE_ALL_USERS and g.permission == ACL_PERMISSION_READ for g in acl.grants
        )
        has_public_write = any(
            g.grantee == GRANTEE_ALL_USERS and g.permission == ACL_PERMISSION_WRITE for g in acl.grants
        )
        assert has_public_read
        assert has_public_write

    def test_authenticated_read_acl(self):
        acl = create_canned_acl("authenticated-read", "owner")
        has_authenticated_read = any(
            g.grantee == GRANTEE_AUTHENTICATED_USERS and g.permission == ACL_PERMISSION_READ for g in acl.grants
        )
        assert has_authenticated_read

    def test_unknown_canned_acl_defaults_to_private(self):
        acl = create_canned_acl("unknown-acl", "owner")
        private_acl = create_canned_acl("private", "owner")
        assert acl.to_dict() == private_acl.to_dict()


@pytest.fixture
def acl_service(tmp_path: Path):
    return AclService(tmp_path)


class TestAclService:
    def test_get_bucket_acl_not_exists(self, acl_service):
        result = acl_service.get_bucket_acl("nonexistent-bucket")
        assert result is None

    def test_set_and_get_bucket_acl(self, acl_service):
        acl = Acl(
            owner="bucket-owner",
            grants=[AclGrant(grantee="bucket-owner", permission=ACL_PERMISSION_FULL_CONTROL)],
        )
        acl_service.set_bucket_acl("my-bucket", acl)

        retrieved = acl_service.get_bucket_acl("my-bucket")
        assert retrieved is not None
        assert retrieved.owner == "bucket-owner"
        assert len(retrieved.grants) == 1

    def test_bucket_acl_caching(self, acl_service):
        acl = Acl(owner="cached-owner", grants=[])
        acl_service.set_bucket_acl("cached-bucket", acl)

        acl_service.get_bucket_acl("cached-bucket")
        assert "cached-bucket" in acl_service._bucket_acl_cache

        retrieved = acl_service.get_bucket_acl("cached-bucket")
        assert retrieved.owner == "cached-owner"

    def test_set_bucket_canned_acl(self, acl_service):
        result = acl_service.set_bucket_canned_acl("new-bucket", "public-read", "the-owner")
        assert result.owner == "the-owner"

        retrieved = acl_service.get_bucket_acl("new-bucket")
        assert retrieved is not None
        has_public_read = any(
            g.grantee == GRANTEE_ALL_USERS and g.permission == ACL_PERMISSION_READ for g in retrieved.grants
        )
        assert has_public_read

    def test_delete_bucket_acl(self, acl_service):
        acl = Acl(owner="to-delete-owner", grants=[])
        acl_service.set_bucket_acl("delete-me", acl)
        assert acl_service.get_bucket_acl("delete-me") is not None

        acl_service.delete_bucket_acl("delete-me")
        acl_service._bucket_acl_cache.clear()
        assert acl_service.get_bucket_acl("delete-me") is None

    def test_evaluate_bucket_acl_allowed(self, acl_service):
        acl = Acl(
            owner="owner",
            grants=[AclGrant(grantee=GRANTEE_ALL_USERS, permission=ACL_PERMISSION_READ)],
        )
        acl_service.set_bucket_acl("public-bucket", acl)

        result = acl_service.evaluate_bucket_acl("public-bucket", None, "read", is_authenticated=False)
        assert result is True

    def test_evaluate_bucket_acl_denied(self, acl_service):
        acl = Acl(
            owner="owner",
            grants=[AclGrant(grantee="owner", permission=ACL_PERMISSION_FULL_CONTROL)],
        )
        acl_service.set_bucket_acl("private-bucket", acl)

        result = acl_service.evaluate_bucket_acl("private-bucket", "other-user", "write", is_authenticated=True)
        assert result is False

    def test_evaluate_bucket_acl_no_acl(self, acl_service):
        result = acl_service.evaluate_bucket_acl("no-acl-bucket", "anyone", "read")
        assert result is False

    def test_get_object_acl_from_metadata(self, acl_service):
        metadata = {
            "__acl__": {
                "owner": "object-owner",
                "grants": [{"grantee": "object-owner", "permission": "FULL_CONTROL"}],
            }
        }
        result = acl_service.get_object_acl("bucket", "key", metadata)
        assert result is not None
        assert result.owner == "object-owner"

    def test_get_object_acl_no_acl_in_metadata(self, acl_service):
        metadata = {"Content-Type": "text/plain"}
        result = acl_service.get_object_acl("bucket", "key", metadata)
        assert result is None

    def test_create_object_acl_metadata(self, acl_service):
        acl = Acl(owner="obj-owner", grants=[])
        result = acl_service.create_object_acl_metadata(acl)
        assert "__acl__" in result
        assert result["__acl__"]["owner"] == "obj-owner"

    def test_evaluate_object_acl(self, acl_service):
        metadata = {
            "__acl__": {
                "owner": "obj-owner",
                "grants": [{"grantee": "*", "permission": "READ"}],
            }
        }
        result = acl_service.evaluate_object_acl(metadata, None, "read", is_authenticated=False)
        assert result is True

        result = acl_service.evaluate_object_acl(metadata, None, "write", is_authenticated=False)
        assert result is False
238
tests/test_lifecycle.py
Normal file
@@ -0,0 +1,238 @@
import io
import time
from datetime import datetime, timedelta, timezone
from pathlib import Path
from unittest.mock import MagicMock, patch

import pytest

from app.lifecycle import LifecycleManager, LifecycleResult
from app.storage import ObjectStorage


@pytest.fixture
def storage(tmp_path: Path):
    storage_root = tmp_path / "data"
    storage_root.mkdir(parents=True)
    return ObjectStorage(storage_root)


@pytest.fixture
def lifecycle_manager(storage):
    manager = LifecycleManager(storage, interval_seconds=3600)
    yield manager
    manager.stop()


class TestLifecycleResult:
    def test_default_values(self):
        result = LifecycleResult(bucket_name="test-bucket")
        assert result.bucket_name == "test-bucket"
        assert result.objects_deleted == 0
        assert result.versions_deleted == 0
        assert result.uploads_aborted == 0
        assert result.errors == []
        assert result.execution_time_seconds == 0.0


class TestLifecycleManager:
    def test_start_and_stop(self, lifecycle_manager):
        lifecycle_manager.start()
        assert lifecycle_manager._timer is not None
        assert lifecycle_manager._shutdown is False

        lifecycle_manager.stop()
        assert lifecycle_manager._shutdown is True
        assert lifecycle_manager._timer is None

    def test_start_only_once(self, lifecycle_manager):
        lifecycle_manager.start()
        first_timer = lifecycle_manager._timer

        lifecycle_manager.start()
        assert lifecycle_manager._timer is first_timer

    def test_enforce_rules_no_lifecycle(self, lifecycle_manager, storage):
        storage.create_bucket("no-lifecycle-bucket")

        result = lifecycle_manager.enforce_rules("no-lifecycle-bucket")
        assert result.bucket_name == "no-lifecycle-bucket"
        assert result.objects_deleted == 0

    def test_enforce_rules_disabled_rule(self, lifecycle_manager, storage):
        storage.create_bucket("disabled-bucket")
        storage.set_bucket_lifecycle("disabled-bucket", [
            {
                "ID": "disabled-rule",
                "Status": "Disabled",
                "Prefix": "",
                "Expiration": {"Days": 1},
            }
        ])

        old_object = storage.put_object(
            "disabled-bucket",
            "old-file.txt",
            io.BytesIO(b"old content"),
        )

        result = lifecycle_manager.enforce_rules("disabled-bucket")
        assert result.objects_deleted == 0
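
    # A freshly written object cannot be 30 days old, so the enforcement pass
    # below should delete nothing; exercising the actual expiry path would
    # require backdating the object's timestamp.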
    def test_enforce_expiration_by_days(self, lifecycle_manager, storage):
        storage.create_bucket("expire-bucket")
        storage.set_bucket_lifecycle("expire-bucket", [
            {
                "ID": "expire-30-days",
                "Status": "Enabled",
                "Prefix": "",
                "Expiration": {"Days": 30},
            }
        ])

        storage.put_object(
            "expire-bucket",
            "recent-file.txt",
            io.BytesIO(b"recent content"),
        )

        result = lifecycle_manager.enforce_rules("expire-bucket")
        assert result.objects_deleted == 0

    def test_enforce_expiration_with_prefix(self, lifecycle_manager, storage):
        storage.create_bucket("prefix-bucket")
        storage.set_bucket_lifecycle("prefix-bucket", [
            {
                "ID": "expire-logs",
                "Status": "Enabled",
                "Prefix": "logs/",
                "Expiration": {"Days": 1},
            }
        ])

        storage.put_object("prefix-bucket", "logs/old.log", io.BytesIO(b"log data"))
        storage.put_object("prefix-bucket", "data/keep.txt", io.BytesIO(b"keep this"))

        result = lifecycle_manager.enforce_rules("prefix-bucket")

    def test_enforce_all_buckets(self, lifecycle_manager, storage):
        storage.create_bucket("bucket1")
        storage.create_bucket("bucket2")

        results = lifecycle_manager.enforce_all_buckets()
        assert isinstance(results, dict)

    def test_run_now_single_bucket(self, lifecycle_manager, storage):
        storage.create_bucket("run-now-bucket")

        results = lifecycle_manager.run_now("run-now-bucket")
        assert "run-now-bucket" in results

    def test_run_now_all_buckets(self, lifecycle_manager, storage):
        storage.create_bucket("all-bucket-1")
        storage.create_bucket("all-bucket-2")

        results = lifecycle_manager.run_now()
        assert isinstance(results, dict)

    def test_enforce_abort_multipart(self, lifecycle_manager, storage):
        storage.create_bucket("multipart-bucket")
        storage.set_bucket_lifecycle("multipart-bucket", [
            {
                "ID": "abort-old-uploads",
                "Status": "Enabled",
                "Prefix": "",
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
            }
        ])

        upload_id = storage.initiate_multipart_upload("multipart-bucket", "large-file.bin")

        result = lifecycle_manager.enforce_rules("multipart-bucket")
        assert result.uploads_aborted == 0

    def test_enforce_noncurrent_version_expiration(self, lifecycle_manager, storage):
        storage.create_bucket("versioned-bucket")
        storage.set_bucket_versioning("versioned-bucket", True)
        storage.set_bucket_lifecycle("versioned-bucket", [
            {
                "ID": "expire-old-versions",
                "Status": "Enabled",
                "Prefix": "",
                "NoncurrentVersionExpiration": {"NoncurrentDays": 30},
            }
        ])

        storage.put_object("versioned-bucket", "file.txt", io.BytesIO(b"v1"))
        storage.put_object("versioned-bucket", "file.txt", io.BytesIO(b"v2"))

        result = lifecycle_manager.enforce_rules("versioned-bucket")
        assert result.bucket_name == "versioned-bucket"

    def test_execution_time_tracking(self, lifecycle_manager, storage):
        storage.create_bucket("timed-bucket")
        storage.set_bucket_lifecycle("timed-bucket", [
            {
                "ID": "timer-test",
                "Status": "Enabled",
                "Expiration": {"Days": 1},
            }
        ])

        result = lifecycle_manager.enforce_rules("timed-bucket")
        assert result.execution_time_seconds >= 0

    def test_enforce_rules_with_error(self, lifecycle_manager, storage):
        result = lifecycle_manager.enforce_rules("nonexistent-bucket")
        assert len(result.errors) > 0 or result.objects_deleted == 0

    def test_lifecycle_with_date_expiration(self, lifecycle_manager, storage):
        storage.create_bucket("date-bucket")
        past_date = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%dT00:00:00Z")
        storage.set_bucket_lifecycle("date-bucket", [
            {
                "ID": "expire-by-date",
                "Status": "Enabled",
                "Prefix": "",
                "Expiration": {"Date": past_date},
            }
        ])

        storage.put_object("date-bucket", "should-expire.txt", io.BytesIO(b"content"))

        result = lifecycle_manager.enforce_rules("date-bucket")

    def test_enforce_with_filter_prefix(self, lifecycle_manager, storage):
        storage.create_bucket("filter-bucket")
        storage.set_bucket_lifecycle("filter-bucket", [
            {
                "ID": "filter-prefix-rule",
                "Status": "Enabled",
                "Filter": {"Prefix": "archive/"},
                "Expiration": {"Days": 1},
            }
        ])

        result = lifecycle_manager.enforce_rules("filter-bucket")
        assert result.bucket_name == "filter-bucket"
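

# These scheduling tests reach into private state (_timer, _shutdown,
# _schedule_next, _run_enforcement); they assume the manager re-arms a timer
# after each enforcement pass and that stop() sets the shutdown flag and
# clears the timer.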
class TestLifecycleManagerScheduling:
    def test_schedule_next_respects_shutdown(self, storage):
        manager = LifecycleManager(storage, interval_seconds=1)
        manager._shutdown = True
        manager._schedule_next()
        assert manager._timer is None

    @patch.object(LifecycleManager, "enforce_all_buckets")
    def test_run_enforcement_catches_exceptions(self, mock_enforce, storage):
        mock_enforce.side_effect = Exception("Test error")
        manager = LifecycleManager(storage, interval_seconds=3600)
        manager._shutdown = True
        manager._run_enforcement()

    def test_shutdown_flag_prevents_scheduling(self, storage):
        manager = LifecycleManager(storage, interval_seconds=1)
        manager.start()
        manager.stop()
        assert manager._shutdown is True
374
tests/test_notifications.py
Normal file
@@ -0,0 +1,374 @@
import json
import time
from datetime import datetime, timezone
from pathlib import Path
from unittest.mock import MagicMock, patch

import pytest

from app.notifications import (
    NotificationConfiguration,
    NotificationEvent,
    NotificationService,
    WebhookDestination,
)


class TestNotificationEvent:
    def test_default_values(self):
        event = NotificationEvent(
            event_name="s3:ObjectCreated:Put",
            bucket_name="test-bucket",
            object_key="test/key.txt",
        )
        assert event.event_name == "s3:ObjectCreated:Put"
        assert event.bucket_name == "test-bucket"
        assert event.object_key == "test/key.txt"
        assert event.object_size == 0
        assert event.etag == ""
        assert event.version_id is None
        assert event.request_id != ""

    def test_to_s3_event(self):
        event = NotificationEvent(
            event_name="s3:ObjectCreated:Put",
            bucket_name="my-bucket",
            object_key="my/object.txt",
            object_size=1024,
            etag="abc123",
            version_id="v1",
            source_ip="192.168.1.1",
            user_identity="user123",
        )
        result = event.to_s3_event()

        assert "Records" in result
        assert len(result["Records"]) == 1

        record = result["Records"][0]
        assert record["eventVersion"] == "2.1"
        assert record["eventSource"] == "myfsio:s3"
        assert record["eventName"] == "s3:ObjectCreated:Put"
        assert record["s3"]["bucket"]["name"] == "my-bucket"
        assert record["s3"]["object"]["key"] == "my/object.txt"
        assert record["s3"]["object"]["size"] == 1024
        assert record["s3"]["object"]["eTag"] == "abc123"
        assert record["s3"]["object"]["versionId"] == "v1"
        assert record["userIdentity"]["principalId"] == "user123"
        assert record["requestParameters"]["sourceIPAddress"] == "192.168.1.1"


class TestWebhookDestination:
    def test_default_values(self):
        dest = WebhookDestination(url="http://example.com/webhook")
        assert dest.url == "http://example.com/webhook"
        assert dest.headers == {}
        assert dest.timeout_seconds == 30
        assert dest.retry_count == 3
        assert dest.retry_delay_seconds == 1

    def test_to_dict(self):
        dest = WebhookDestination(
            url="http://example.com/webhook",
            headers={"X-Custom": "value"},
            timeout_seconds=60,
            retry_count=5,
            retry_delay_seconds=2,
        )
        result = dest.to_dict()
        assert result["url"] == "http://example.com/webhook"
        assert result["headers"] == {"X-Custom": "value"}
        assert result["timeout_seconds"] == 60
        assert result["retry_count"] == 5
        assert result["retry_delay_seconds"] == 2

    def test_from_dict(self):
        data = {
            "url": "http://hook.example.com",
            "headers": {"Authorization": "Bearer token"},
            "timeout_seconds": 45,
            "retry_count": 2,
            "retry_delay_seconds": 5,
        }
        dest = WebhookDestination.from_dict(data)
        assert dest.url == "http://hook.example.com"
        assert dest.headers == {"Authorization": "Bearer token"}
        assert dest.timeout_seconds == 45
        assert dest.retry_count == 2
        assert dest.retry_delay_seconds == 5
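

# matches_event is expected to accept exact event names or a trailing "*"
# wildcard segment (e.g. "s3:ObjectCreated:*"), and to apply the optional key
# prefix/suffix filters conjunctively: every configured filter must match.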
class TestNotificationConfiguration:
    def test_matches_event_exact_match(self):
        config = NotificationConfiguration(
            id="config1",
            events=["s3:ObjectCreated:Put"],
            destination=WebhookDestination(url="http://example.com"),
        )
        assert config.matches_event("s3:ObjectCreated:Put", "any/key.txt") is True
        assert config.matches_event("s3:ObjectCreated:Post", "any/key.txt") is False

    def test_matches_event_wildcard(self):
        config = NotificationConfiguration(
            id="config1",
            events=["s3:ObjectCreated:*"],
            destination=WebhookDestination(url="http://example.com"),
        )
        assert config.matches_event("s3:ObjectCreated:Put", "key.txt") is True
        assert config.matches_event("s3:ObjectCreated:Copy", "key.txt") is True
        assert config.matches_event("s3:ObjectRemoved:Delete", "key.txt") is False

    def test_matches_event_with_prefix_filter(self):
        config = NotificationConfiguration(
            id="config1",
            events=["s3:ObjectCreated:*"],
            destination=WebhookDestination(url="http://example.com"),
            prefix_filter="logs/",
        )
        assert config.matches_event("s3:ObjectCreated:Put", "logs/app.log") is True
        assert config.matches_event("s3:ObjectCreated:Put", "data/file.txt") is False

    def test_matches_event_with_suffix_filter(self):
        config = NotificationConfiguration(
            id="config1",
            events=["s3:ObjectCreated:*"],
            destination=WebhookDestination(url="http://example.com"),
            suffix_filter=".jpg",
        )
        assert config.matches_event("s3:ObjectCreated:Put", "photos/image.jpg") is True
        assert config.matches_event("s3:ObjectCreated:Put", "photos/image.png") is False

    def test_matches_event_with_both_filters(self):
        config = NotificationConfiguration(
            id="config1",
            events=["s3:ObjectCreated:*"],
            destination=WebhookDestination(url="http://example.com"),
            prefix_filter="images/",
            suffix_filter=".png",
        )
        assert config.matches_event("s3:ObjectCreated:Put", "images/photo.png") is True
        assert config.matches_event("s3:ObjectCreated:Put", "images/photo.jpg") is False
        assert config.matches_event("s3:ObjectCreated:Put", "documents/file.png") is False

    def test_to_dict(self):
        config = NotificationConfiguration(
            id="my-config",
            events=["s3:ObjectCreated:Put", "s3:ObjectRemoved:Delete"],
            destination=WebhookDestination(url="http://example.com"),
            prefix_filter="logs/",
            suffix_filter=".log",
        )
        result = config.to_dict()
        assert result["Id"] == "my-config"
        assert result["Events"] == ["s3:ObjectCreated:Put", "s3:ObjectRemoved:Delete"]
        assert "Destination" in result
        assert result["Filter"]["Key"]["FilterRules"][0]["Value"] == "logs/"
        assert result["Filter"]["Key"]["FilterRules"][1]["Value"] == ".log"

    def test_from_dict(self):
        data = {
            "Id": "parsed-config",
            "Events": ["s3:ObjectCreated:*"],
            "Destination": {"url": "http://hook.example.com"},
            "Filter": {
                "Key": {
                    "FilterRules": [
                        {"Name": "prefix", "Value": "data/"},
                        {"Name": "suffix", "Value": ".csv"},
                    ]
                }
            },
        }
        config = NotificationConfiguration.from_dict(data)
        assert config.id == "parsed-config"
        assert config.events == ["s3:ObjectCreated:*"]
        assert config.destination.url == "http://hook.example.com"
        assert config.prefix_filter == "data/"
        assert config.suffix_filter == ".csv"


@pytest.fixture
def notification_service(tmp_path: Path):
    service = NotificationService(tmp_path, worker_count=1)
    yield service
    service.shutdown()


class TestNotificationService:
    def test_get_bucket_notifications_empty(self, notification_service):
        result = notification_service.get_bucket_notifications("nonexistent-bucket")
        assert result == []

    def test_set_and_get_bucket_notifications(self, notification_service):
        configs = [
            NotificationConfiguration(
                id="config1",
                events=["s3:ObjectCreated:*"],
                destination=WebhookDestination(url="http://example.com/webhook1"),
            ),
            NotificationConfiguration(
                id="config2",
                events=["s3:ObjectRemoved:*"],
                destination=WebhookDestination(url="http://example.com/webhook2"),
            ),
        ]
        notification_service.set_bucket_notifications("my-bucket", configs)

        retrieved = notification_service.get_bucket_notifications("my-bucket")
        assert len(retrieved) == 2
        assert retrieved[0].id == "config1"
        assert retrieved[1].id == "config2"

    def test_delete_bucket_notifications(self, notification_service):
        configs = [
            NotificationConfiguration(
                id="to-delete",
                events=["s3:ObjectCreated:*"],
                destination=WebhookDestination(url="http://example.com"),
            ),
        ]
        notification_service.set_bucket_notifications("delete-bucket", configs)
        assert len(notification_service.get_bucket_notifications("delete-bucket")) == 1

        notification_service.delete_bucket_notifications("delete-bucket")
        notification_service._configs.clear()
        assert len(notification_service.get_bucket_notifications("delete-bucket")) == 0

    def test_emit_event_no_config(self, notification_service):
        event = NotificationEvent(
            event_name="s3:ObjectCreated:Put",
            bucket_name="no-config-bucket",
            object_key="test.txt",
        )
        notification_service.emit_event(event)
        assert notification_service._stats["events_queued"] == 0

    def test_emit_event_matching_config(self, notification_service):
        configs = [
            NotificationConfiguration(
                id="match-config",
                events=["s3:ObjectCreated:*"],
                destination=WebhookDestination(url="http://example.com/webhook"),
            ),
        ]
        notification_service.set_bucket_notifications("event-bucket", configs)

        event = NotificationEvent(
            event_name="s3:ObjectCreated:Put",
            bucket_name="event-bucket",
            object_key="test.txt",
        )
        notification_service.emit_event(event)
        assert notification_service._stats["events_queued"] == 1

    def test_emit_event_non_matching_config(self, notification_service):
        configs = [
            NotificationConfiguration(
                id="delete-only",
                events=["s3:ObjectRemoved:*"],
                destination=WebhookDestination(url="http://example.com/webhook"),
            ),
        ]
        notification_service.set_bucket_notifications("delete-bucket", configs)

        event = NotificationEvent(
            event_name="s3:ObjectCreated:Put",
            bucket_name="delete-bucket",
            object_key="test.txt",
        )
        notification_service.emit_event(event)
        assert notification_service._stats["events_queued"] == 0

    def test_emit_object_created(self, notification_service):
        configs = [
            NotificationConfiguration(
                id="create-config",
                events=["s3:ObjectCreated:Put"],
                destination=WebhookDestination(url="http://example.com/webhook"),
            ),
        ]
        notification_service.set_bucket_notifications("create-bucket", configs)

        notification_service.emit_object_created(
            "create-bucket",
            "new-file.txt",
            size=1024,
            etag="abc123",
            operation="Put",
        )
        assert notification_service._stats["events_queued"] == 1

    def test_emit_object_removed(self, notification_service):
        configs = [
            NotificationConfiguration(
                id="remove-config",
                events=["s3:ObjectRemoved:Delete"],
                destination=WebhookDestination(url="http://example.com/webhook"),
            ),
        ]
        notification_service.set_bucket_notifications("remove-bucket", configs)

        notification_service.emit_object_removed(
            "remove-bucket",
            "deleted-file.txt",
            operation="Delete",
        )
        assert notification_service._stats["events_queued"] == 1

    def test_get_stats(self, notification_service):
        stats = notification_service.get_stats()
        assert "events_queued" in stats
        assert "events_sent" in stats
        assert "events_failed" in stats

    @patch("app.notifications.requests.post")
    def test_send_notification_success(self, mock_post, notification_service):
        mock_response = MagicMock()
        mock_response.status_code = 200
        mock_post.return_value = mock_response

        event = NotificationEvent(
            event_name="s3:ObjectCreated:Put",
            bucket_name="test-bucket",
            object_key="test.txt",
        )
        destination = WebhookDestination(url="http://example.com/webhook")

        notification_service._send_notification(event, destination)
        mock_post.assert_called_once()
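
    # A non-2xx status is treated as a failure: _send_notification should retry
    # up to retry_count times (sleeping retry_delay_seconds between attempts)
    # and raise RuntimeError once all attempts are exhausted.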
    @patch("app.notifications.requests.post")
    def test_send_notification_retry_on_failure(self, mock_post, notification_service):
        mock_response = MagicMock()
        mock_response.status_code = 500
        mock_response.text = "Internal Server Error"
        mock_post.return_value = mock_response

        event = NotificationEvent(
            event_name="s3:ObjectCreated:Put",
            bucket_name="test-bucket",
            object_key="test.txt",
        )
        destination = WebhookDestination(
            url="http://example.com/webhook",
            retry_count=2,
            retry_delay_seconds=0,
        )

        with pytest.raises(RuntimeError) as exc_info:
            notification_service._send_notification(event, destination)
        assert "Failed after 2 attempts" in str(exc_info.value)
        assert mock_post.call_count == 2

    def test_notification_caching(self, notification_service):
        configs = [
            NotificationConfiguration(
                id="cached-config",
                events=["s3:ObjectCreated:*"],
                destination=WebhookDestination(url="http://example.com"),
            ),
        ]
        notification_service.set_bucket_notifications("cached-bucket", configs)

        notification_service.get_bucket_notifications("cached-bucket")
        assert "cached-bucket" in notification_service._configs
332
tests/test_object_lock.py
Normal file
@@ -0,0 +1,332 @@
import json
from datetime import datetime, timedelta, timezone
from pathlib import Path

import pytest

from app.object_lock import (
    ObjectLockConfig,
    ObjectLockError,
    ObjectLockRetention,
    ObjectLockService,
    RetentionMode,
)


class TestRetentionMode:
    def test_governance_mode(self):
        assert RetentionMode.GOVERNANCE.value == "GOVERNANCE"

    def test_compliance_mode(self):
        assert RetentionMode.COMPLIANCE.value == "COMPLIANCE"


class TestObjectLockRetention:
    def test_to_dict(self):
        retain_until = datetime(2025, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=retain_until,
        )
        result = retention.to_dict()
        assert result["Mode"] == "GOVERNANCE"
        assert "2025-12-31" in result["RetainUntilDate"]

    def test_from_dict(self):
        data = {
            "Mode": "COMPLIANCE",
            "RetainUntilDate": "2030-06-15T12:00:00+00:00",
        }
        retention = ObjectLockRetention.from_dict(data)
        assert retention is not None
        assert retention.mode == RetentionMode.COMPLIANCE
        assert retention.retain_until_date.year == 2030

    def test_from_dict_empty(self):
        result = ObjectLockRetention.from_dict({})
        assert result is None

    def test_from_dict_missing_mode(self):
        data = {"RetainUntilDate": "2030-06-15T12:00:00+00:00"}
        result = ObjectLockRetention.from_dict(data)
        assert result is None

    def test_from_dict_missing_date(self):
        data = {"Mode": "GOVERNANCE"}
        result = ObjectLockRetention.from_dict(data)
        assert result is None

    def test_is_expired_future_date(self):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future,
        )
        assert retention.is_expired() is False

    def test_is_expired_past_date(self):
        past = datetime.now(timezone.utc) - timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=past,
        )
        assert retention.is_expired() is True


class TestObjectLockConfig:
    def test_to_dict_enabled(self):
        config = ObjectLockConfig(enabled=True)
        result = config.to_dict()
        assert result["ObjectLockEnabled"] == "Enabled"

    def test_to_dict_disabled(self):
        config = ObjectLockConfig(enabled=False)
        result = config.to_dict()
        assert result["ObjectLockEnabled"] == "Disabled"

    def test_from_dict_enabled(self):
        data = {"ObjectLockEnabled": "Enabled"}
        config = ObjectLockConfig.from_dict(data)
        assert config.enabled is True

    def test_from_dict_disabled(self):
        data = {"ObjectLockEnabled": "Disabled"}
        config = ObjectLockConfig.from_dict(data)
        assert config.enabled is False

    def test_from_dict_with_default_retention_days(self):
        data = {
            "ObjectLockEnabled": "Enabled",
            "Rule": {
                "DefaultRetention": {
                    "Mode": "GOVERNANCE",
                    "Days": 30,
                }
            },
        }
        config = ObjectLockConfig.from_dict(data)
        assert config.enabled is True
        assert config.default_retention is not None
        assert config.default_retention.mode == RetentionMode.GOVERNANCE

    def test_from_dict_with_default_retention_years(self):
        data = {
            "ObjectLockEnabled": "Enabled",
            "Rule": {
                "DefaultRetention": {
                    "Mode": "COMPLIANCE",
                    "Years": 1,
                }
            },
        }
        config = ObjectLockConfig.from_dict(data)
        assert config.enabled is True
        assert config.default_retention is not None
        assert config.default_retention.mode == RetentionMode.COMPLIANCE


@pytest.fixture
def lock_service(tmp_path: Path):
    return ObjectLockService(tmp_path)


class TestObjectLockService:
    def test_get_bucket_lock_config_default(self, lock_service):
        config = lock_service.get_bucket_lock_config("nonexistent-bucket")
        assert config.enabled is False
        assert config.default_retention is None

    def test_set_and_get_bucket_lock_config(self, lock_service):
        config = ObjectLockConfig(enabled=True)
        lock_service.set_bucket_lock_config("my-bucket", config)

        retrieved = lock_service.get_bucket_lock_config("my-bucket")
        assert retrieved.enabled is True

    def test_enable_bucket_lock(self, lock_service):
        lock_service.enable_bucket_lock("lock-bucket")

        config = lock_service.get_bucket_lock_config("lock-bucket")
        assert config.enabled is True

    def test_is_bucket_lock_enabled(self, lock_service):
        assert lock_service.is_bucket_lock_enabled("new-bucket") is False

        lock_service.enable_bucket_lock("new-bucket")
        assert lock_service.is_bucket_lock_enabled("new-bucket") is True

    def test_get_object_retention_not_set(self, lock_service):
        result = lock_service.get_object_retention("bucket", "key.txt")
        assert result is None

    def test_set_and_get_object_retention(self, lock_service):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "key.txt", retention)

        retrieved = lock_service.get_object_retention("bucket", "key.txt")
        assert retrieved is not None
        assert retrieved.mode == RetentionMode.GOVERNANCE
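
    # Retention semantics exercised below: COMPLIANCE retention is immutable
    # until it expires, while GOVERNANCE retention may only be modified (or the
    # object deleted) when the caller passes bypass_governance=True.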
    def test_cannot_modify_compliance_retention(self, lock_service):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.COMPLIANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "locked.txt", retention)

        new_retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future + timedelta(days=10),
        )
        with pytest.raises(ObjectLockError) as exc_info:
            lock_service.set_object_retention("bucket", "locked.txt", new_retention)
        assert "COMPLIANCE" in str(exc_info.value)

    def test_cannot_modify_governance_without_bypass(self, lock_service):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "gov.txt", retention)

        new_retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future + timedelta(days=10),
        )
        with pytest.raises(ObjectLockError) as exc_info:
            lock_service.set_object_retention("bucket", "gov.txt", new_retention)
        assert "GOVERNANCE" in str(exc_info.value)

    def test_can_modify_governance_with_bypass(self, lock_service):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "bypassable.txt", retention)

        new_retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future + timedelta(days=10),
        )
        lock_service.set_object_retention("bucket", "bypassable.txt", new_retention, bypass_governance=True)
        retrieved = lock_service.get_object_retention("bucket", "bypassable.txt")
        assert retrieved.retain_until_date > future

    def test_can_modify_expired_retention(self, lock_service):
        past = datetime.now(timezone.utc) - timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.COMPLIANCE,
            retain_until_date=past,
        )
        lock_service.set_object_retention("bucket", "expired.txt", retention)

        future = datetime.now(timezone.utc) + timedelta(days=30)
        new_retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "expired.txt", new_retention)
        retrieved = lock_service.get_object_retention("bucket", "expired.txt")
        assert retrieved.mode == RetentionMode.GOVERNANCE

    def test_get_legal_hold_not_set(self, lock_service):
        result = lock_service.get_legal_hold("bucket", "key.txt")
        assert result is False

    def test_set_and_get_legal_hold(self, lock_service):
        lock_service.set_legal_hold("bucket", "held.txt", True)
        assert lock_service.get_legal_hold("bucket", "held.txt") is True

        lock_service.set_legal_hold("bucket", "held.txt", False)
        assert lock_service.get_legal_hold("bucket", "held.txt") is False

    def test_can_delete_object_no_lock(self, lock_service):
        can_delete, reason = lock_service.can_delete_object("bucket", "unlocked.txt")
        assert can_delete is True
        assert reason == ""

    def test_cannot_delete_object_with_legal_hold(self, lock_service):
        lock_service.set_legal_hold("bucket", "held.txt", True)

        can_delete, reason = lock_service.can_delete_object("bucket", "held.txt")
        assert can_delete is False
        assert "legal hold" in reason.lower()

    def test_cannot_delete_object_with_compliance_retention(self, lock_service):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.COMPLIANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "compliant.txt", retention)

        can_delete, reason = lock_service.can_delete_object("bucket", "compliant.txt")
        assert can_delete is False
        assert "COMPLIANCE" in reason

    def test_cannot_delete_governance_without_bypass(self, lock_service):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "governed.txt", retention)

        can_delete, reason = lock_service.can_delete_object("bucket", "governed.txt")
        assert can_delete is False
        assert "GOVERNANCE" in reason

    def test_can_delete_governance_with_bypass(self, lock_service):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "governed.txt", retention)

        can_delete, reason = lock_service.can_delete_object("bucket", "governed.txt", bypass_governance=True)
        assert can_delete is True
        assert reason == ""

    def test_can_delete_expired_retention(self, lock_service):
        past = datetime.now(timezone.utc) - timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.COMPLIANCE,
            retain_until_date=past,
        )
        lock_service.set_object_retention("bucket", "expired.txt", retention)

        can_delete, reason = lock_service.can_delete_object("bucket", "expired.txt")
        assert can_delete is True

    def test_can_overwrite_is_same_as_delete(self, lock_service):
        future = datetime.now(timezone.utc) + timedelta(days=30)
        retention = ObjectLockRetention(
            mode=RetentionMode.GOVERNANCE,
            retain_until_date=future,
        )
        lock_service.set_object_retention("bucket", "overwrite.txt", retention)

        can_overwrite, _ = lock_service.can_overwrite_object("bucket", "overwrite.txt")
        can_delete, _ = lock_service.can_delete_object("bucket", "overwrite.txt")
        assert can_overwrite == can_delete

    def test_delete_object_lock_metadata(self, lock_service):
        lock_service.set_legal_hold("bucket", "cleanup.txt", True)
        lock_service.delete_object_lock_metadata("bucket", "cleanup.txt")

        assert lock_service.get_legal_hold("bucket", "cleanup.txt") is False

    def test_config_caching(self, lock_service):
        config = ObjectLockConfig(enabled=True)
        lock_service.set_bucket_lock_config("cached-bucket", config)

        lock_service.get_bucket_lock_config("cached-bucket")
        assert "cached-bucket" in lock_service._config_cache
287
tests/test_replication.py
Normal file
@@ -0,0 +1,287 @@
import json
|
||||||
|
import time
|
||||||
|
from pathlib import Path
|
||||||
|
from unittest.mock import MagicMock, patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
from app.connections import ConnectionStore, RemoteConnection
|
||||||
|
from app.replication import (
|
||||||
|
ReplicationManager,
|
||||||
|
ReplicationRule,
|
||||||
|
ReplicationStats,
|
||||||
|
REPLICATION_MODE_ALL,
|
||||||
|
REPLICATION_MODE_NEW_ONLY,
|
||||||
|
_create_s3_client,
|
||||||
|
)
|
||||||
|
from app.storage import ObjectStorage
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def storage(tmp_path: Path):
|
||||||
|
storage_root = tmp_path / "data"
|
||||||
|
storage_root.mkdir(parents=True)
|
||||||
|
return ObjectStorage(storage_root)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def connections(tmp_path: Path):
|
||||||
|
connections_path = tmp_path / "connections.json"
|
||||||
|
store = ConnectionStore(connections_path)
|
||||||
|
conn = RemoteConnection(
|
||||||
|
id="test-conn",
|
||||||
|
name="Test Remote",
|
||||||
|
endpoint_url="http://localhost:9000",
|
||||||
|
access_key="remote-access",
|
||||||
|
secret_key="remote-secret",
|
||||||
|
region="us-east-1",
|
||||||
|
)
|
||||||
|
store.add(conn)
|
||||||
|
return store
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def replication_manager(storage, connections, tmp_path):
|
||||||
|
rules_path = tmp_path / "replication_rules.json"
|
||||||
|
storage_root = tmp_path / "data"
|
||||||
|
storage_root.mkdir(exist_ok=True)
|
||||||
|
manager = ReplicationManager(storage, connections, rules_path, storage_root)
|
||||||
|
yield manager
|
||||||
|
manager.shutdown(wait=False)
|
||||||
|
|
||||||
|
|
||||||
|
class TestReplicationStats:
    def test_to_dict(self):
        stats = ReplicationStats(
            objects_synced=10,
            objects_pending=5,
            objects_orphaned=2,
            bytes_synced=1024,
            last_sync_at=1234567890.0,
            last_sync_key="test/key.txt",
        )
        result = stats.to_dict()
        assert result["objects_synced"] == 10
        assert result["objects_pending"] == 5
        assert result["objects_orphaned"] == 2
        assert result["bytes_synced"] == 1024
        assert result["last_sync_at"] == 1234567890.0
        assert result["last_sync_key"] == "test/key.txt"

    def test_from_dict(self):
        data = {
            "objects_synced": 15,
            "objects_pending": 3,
            "objects_orphaned": 1,
            "bytes_synced": 2048,
            "last_sync_at": 9876543210.0,
            "last_sync_key": "another/key.txt",
        }
        stats = ReplicationStats.from_dict(data)
        assert stats.objects_synced == 15
        assert stats.objects_pending == 3
        assert stats.objects_orphaned == 1
        assert stats.bytes_synced == 2048
        assert stats.last_sync_at == 9876543210.0
        assert stats.last_sync_key == "another/key.txt"

    def test_from_dict_with_defaults(self):
        stats = ReplicationStats.from_dict({})
        assert stats.objects_synced == 0
        assert stats.objects_pending == 0
        assert stats.objects_orphaned == 0
        assert stats.bytes_synced == 0
        assert stats.last_sync_at is None
        assert stats.last_sync_key is None

class TestReplicationRule:
    def test_to_dict(self):
        rule = ReplicationRule(
            bucket_name="source-bucket",
            target_connection_id="test-conn",
            target_bucket="dest-bucket",
            enabled=True,
            mode=REPLICATION_MODE_ALL,
            created_at=1234567890.0,
        )
        result = rule.to_dict()
        assert result["bucket_name"] == "source-bucket"
        assert result["target_connection_id"] == "test-conn"
        assert result["target_bucket"] == "dest-bucket"
        assert result["enabled"] is True
        assert result["mode"] == REPLICATION_MODE_ALL
        assert result["created_at"] == 1234567890.0
        assert "stats" in result

    def test_from_dict(self):
        data = {
            "bucket_name": "my-bucket",
            "target_connection_id": "conn-123",
            "target_bucket": "remote-bucket",
            "enabled": False,
            "mode": REPLICATION_MODE_NEW_ONLY,
            "created_at": 1111111111.0,
            "stats": {"objects_synced": 5},
        }
        rule = ReplicationRule.from_dict(data)
        assert rule.bucket_name == "my-bucket"
        assert rule.target_connection_id == "conn-123"
        assert rule.target_bucket == "remote-bucket"
        assert rule.enabled is False
        assert rule.mode == REPLICATION_MODE_NEW_ONLY
        assert rule.created_at == 1111111111.0
        assert rule.stats.objects_synced == 5

    def test_from_dict_defaults_mode(self):
        data = {
            "bucket_name": "my-bucket",
            "target_connection_id": "conn-123",
            "target_bucket": "remote-bucket",
        }
        rule = ReplicationRule.from_dict(data)
        assert rule.mode == REPLICATION_MODE_NEW_ONLY
        assert rule.created_at is None

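Since `to_dict` and `from_dict` mirror each other, they should compose into a lossless round-trip. A quick property-style check along those lines (a sketch, not part of the suite):

    def test_rule_round_trip():
        original = ReplicationRule(
            bucket_name="rt-bucket",
            target_connection_id="conn",
            target_bucket="dest",
            enabled=True,
            mode=REPLICATION_MODE_ALL,
            created_at=42.0,
        )
        # Serializing and deserializing must preserve every field.
        restored = ReplicationRule.from_dict(original.to_dict())
        assert restored.to_dict() == original.to_dict()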
class TestReplicationManager:
    def test_get_rule_not_exists(self, replication_manager):
        rule = replication_manager.get_rule("nonexistent-bucket")
        assert rule is None

    def test_set_and_get_rule(self, replication_manager):
        rule = ReplicationRule(
            bucket_name="my-bucket",
            target_connection_id="test-conn",
            target_bucket="remote-bucket",
            enabled=True,
            mode=REPLICATION_MODE_NEW_ONLY,
            created_at=time.time(),
        )
        replication_manager.set_rule(rule)

        retrieved = replication_manager.get_rule("my-bucket")
        assert retrieved is not None
        assert retrieved.bucket_name == "my-bucket"
        assert retrieved.target_connection_id == "test-conn"
        assert retrieved.target_bucket == "remote-bucket"

    def test_delete_rule(self, replication_manager):
        rule = ReplicationRule(
            bucket_name="to-delete",
            target_connection_id="test-conn",
            target_bucket="remote-bucket",
        )
        replication_manager.set_rule(rule)
        assert replication_manager.get_rule("to-delete") is not None

        replication_manager.delete_rule("to-delete")
        assert replication_manager.get_rule("to-delete") is None

    def test_save_and_reload_rules(self, replication_manager, tmp_path):
        rule = ReplicationRule(
            bucket_name="persistent-bucket",
            target_connection_id="test-conn",
            target_bucket="remote-bucket",
            enabled=True,
        )
        replication_manager.set_rule(rule)

        rules_path = tmp_path / "replication_rules.json"
        assert rules_path.exists()
        data = json.loads(rules_path.read_text())
        assert "persistent-bucket" in data

    @patch("app.replication._create_s3_client")
    def test_check_endpoint_health_success(self, mock_create_client, replication_manager, connections):
        mock_client = MagicMock()
        mock_client.list_buckets.return_value = {"Buckets": []}
        mock_create_client.return_value = mock_client

        conn = connections.get("test-conn")
        result = replication_manager.check_endpoint_health(conn)
        assert result is True
        mock_client.list_buckets.assert_called_once()

    @patch("app.replication._create_s3_client")
    def test_check_endpoint_health_failure(self, mock_create_client, replication_manager, connections):
        mock_client = MagicMock()
        mock_client.list_buckets.side_effect = Exception("Connection refused")
        mock_create_client.return_value = mock_client

        conn = connections.get("test-conn")
        result = replication_manager.check_endpoint_health(conn)
        assert result is False

    def test_trigger_replication_no_rule(self, replication_manager):
        # No rule configured for the bucket: must be a silent no-op, not an error.
        replication_manager.trigger_replication("no-such-bucket", "test.txt", "write")

    def test_trigger_replication_disabled_rule(self, replication_manager):
        rule = ReplicationRule(
            bucket_name="disabled-bucket",
            target_connection_id="test-conn",
            target_bucket="remote-bucket",
            enabled=False,
        )
        replication_manager.set_rule(rule)
        # Disabled rules must be skipped without raising.
        replication_manager.trigger_replication("disabled-bucket", "test.txt", "write")

    def test_trigger_replication_missing_connection(self, replication_manager):
        rule = ReplicationRule(
            bucket_name="orphan-bucket",
            target_connection_id="missing-conn",
            target_bucket="remote-bucket",
            enabled=True,
        )
        replication_manager.set_rule(rule)
        # A rule pointing at a deleted connection must not raise either.
        replication_manager.trigger_replication("orphan-bucket", "test.txt", "write")

    def test_replicate_task_path_traversal_blocked(self, replication_manager, connections):
        rule = ReplicationRule(
            bucket_name="secure-bucket",
            target_connection_id="test-conn",
            target_bucket="remote-bucket",
            enabled=True,
        )
        replication_manager.set_rule(rule)
        conn = connections.get("test-conn")

        # Hostile keys must be rejected internally rather than escaping the bucket root.
        replication_manager._replicate_task("secure-bucket", "../../../etc/passwd", rule, conn, "write")
        replication_manager._replicate_task("secure-bucket", "/root/secret", rule, conn, "write")
        replication_manager._replicate_task("secure-bucket", "..\\..\\windows\\system32", rule, conn, "write")

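The three `_replicate_task` calls above pass hostile keys and assert nothing beyond "does not raise", so the guard they imply lives inside the task itself. A hypothetical version of that key check (`_is_safe_key` is illustrative, not from the source):

    from pathlib import PurePosixPath

    def _is_safe_key(key: str) -> bool:
        # Normalize Windows separators, then reject absolute paths
        # and any parent-directory traversal segments.
        normalized = key.replace("\\", "/")
        if normalized.startswith("/"):
            return False
        return ".." not in PurePosixPath(normalized).parts

All three keys exercised by the test ("../../../etc/passwd", "/root/secret", "..\\..\\windows\\system32") fail this check.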
class TestCreateS3Client:
    @patch("app.replication.boto3.client")
    def test_creates_client_with_correct_config(self, mock_boto_client):
        conn = RemoteConnection(
            id="test",
            name="Test",
            endpoint_url="http://localhost:9000",
            access_key="access",
            secret_key="secret",
            region="eu-west-1",
        )
        _create_s3_client(conn)

        mock_boto_client.assert_called_once()
        call_kwargs = mock_boto_client.call_args[1]
        assert call_kwargs["endpoint_url"] == "http://localhost:9000"
        assert call_kwargs["aws_access_key_id"] == "access"
        assert call_kwargs["aws_secret_access_key"] == "secret"
        assert call_kwargs["region_name"] == "eu-west-1"

    @patch("app.replication.boto3.client")
    def test_health_check_mode_minimal_retries(self, mock_boto_client):
        conn = RemoteConnection(
            id="test",
            name="Test",
            endpoint_url="http://localhost:9000",
            access_key="access",
            secret_key="secret",
        )
        _create_s3_client(conn, health_check=True)

        call_kwargs = mock_boto_client.call_args[1]
        config = call_kwargs["config"]
        assert config.retries["max_attempts"] == 1
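`test_health_check_mode_minimal_retries` implies that `_create_s3_client(conn, health_check=True)` builds a botocore `Config` capped at a single attempt, so health probes fail fast instead of retrying a dead endpoint. A sketch of such a config (the timeout values are illustrative, not from the source):

    from botocore.config import Config

    health_check_config = Config(
        retries={"max_attempts": 1, "mode": "standard"},  # fail fast on unreachable endpoints
        connect_timeout=3,
        read_timeout=5,
    )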
@@ -184,5 +184,5 @@ class TestPaginatedObjectListing:
         assert resp.status_code == 200

         html = resp.data.decode("utf-8")
-        # Should have the JavaScript loading infrastructure
-        assert "loadObjects" in html or "objectsApiUrl" in html
+        # Should have the JavaScript loading infrastructure (external JS file)
+        assert "bucket-detail-main.js" in html