server-sdk-rust
chore(deps): bump h2 from 0.3.18 to 0.3.26
Bumps h2 from 0.3.18 to 0.3.26.
Release notes
Sourced from h2's releases.
v0.3.26
What's Changed
- Limit number of CONTINUATION frames for misbehaving connections.
See https://seanmonstar.com/blog/hyper-http2-continuation-flood/ for more info.
v0.3.25
What's Changed
- perf: optimize header list size calculations by @Noah-Kennedy in hyperium/h2#750

Full Changelog: https://github.com/hyperium/h2/compare/v0.3.24...v0.3.25
v0.3.24
Fixed
- Limit error resets for misbehaving connections.
v0.3.23
What's Changed
- cherry-pick fix: streams awaiting capacity lockout in hyperium/h2#734
v0.3.22
What's Changed
- Add header_table_size(usize) option to client and server builders.
- Improve throughput when vectored IO is not available.
- Update indexmap to 2.
New Contributors
- @tottoto made their first contribution in hyperium/h2#714
- @xiaoyawei made their first contribution in hyperium/h2#712
- @Protryon made their first contribution in hyperium/h2#719
- @4JX made their first contribution in hyperium/h2#638
- @vuittont60 made their first contribution in hyperium/h2#724
v0.3.21
What's Changed
- Fix opening of new streams over peer's max concurrent limit.
- Fix RecvStream to return data even if it has received a CANCEL stream error.
- Update MSRV to 1.63.
New Contributors
- @DDtKey made their first contribution in hyperium/h2#703
- @jwilm made their first contribution in hyperium/h2#707
v0.3.20
Bug Fixes
... (truncated)
Changelog
Sourced from h2's changelog.
0.3.26 (April 3, 2024)
- Limit number of CONTINUATION frames for misbehaving connections.
0.3.25 (March 15, 2024)
- Improve performance decoding many headers.
0.3.24 (January 17, 2024)
- Limit error resets for misbehaving connections.
0.3.23 (January 10, 2024)
- Backport fix from 0.4.1 for stream capacity assignment.
0.3.22 (November 15, 2023)
- Add header_table_size(usize) option to client and server builders.
- Improve throughput when vectored IO is not available.
- Update indexmap to 2.
0.3.21 (August 21, 2023)
- Fix opening of new streams over peer's max concurrent limit.
- Fix RecvStream to return data even if it has received a CANCEL stream error.
- Update MSRV to 1.63.
0.3.20 (June 26, 2023)
- Fix panic if a server received a request with a :status pseudo header in the 1xx range.
- Fix panic if a reset stream had pending push promises that were more than allowed.
- Fix potential flow control overflow by subtraction, instead returning a connection error.
0.3.19 (May 12, 2023)
- Fix counting reset streams when triggered by a GOAWAY.
- Send too_many_resets in opaque debug data of GOAWAY when too many resets received.
Commits
- 357127e v0.3.26
- 1a357aa fix: limit number of CONTINUATION frames allowed
- 5b6c9e0 refactor: cleanup new unused warnings (#757)
- 3a79832 v0.3.25
- 94e80b1 perf: optimize header list size calculations (#750)
- 7243ab5 Prepare v0.3.24
- d919cd6 streams: limit error resets for misbehaving connections
- a7eb14a v0.3.23
- b668c7f fix: streams awaiting capacity lockout (#730) (#734)
- 0f412d8 v0.3.22
- Additional commits viewable in compare view
You can trigger a rebase of this PR by commenting @dependabot rebase.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- @dependabot rebase will rebase this PR
- @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
- @dependabot merge will merge this PR after your CI passes on it
- @dependabot squash and merge will squash and merge this PR after your CI passes on it
- @dependabot cancel merge will cancel a previously requested merge and block automerging
- @dependabot reopen will reopen this PR if it is closed
- @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
- @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

You can disable automated security fix PRs for this repo from the Security Alerts page.
Note Automatic rebases have been disabled on this pull request as it has been open for over 30 days.
```python
import pyodbc
import pandas as pd


class DBConnection:
    def __init__(self):
        self.conn = None

    def __enter__(self):
        try:
            # Initialize the connection
            print("here connection is getting initialized--->")
            self.conn = pyodbc.connect('connection details',
                                       autocommit=False)
            return self
        except Exception as e:
            print(f"Error occurred while establishing connection: {e}")
            raise

    def __exit__(self, exc_type, exc_value, traceback):
        try:
            if self.conn:
                if exc_type is not None:
                    # Rollback transaction if exception occurred
                    print(f"exception received here--->{exc_type}")
                    self.conn.rollback()
                else:
                    print("Coming for commit--->")
                    # Commit transaction if no exception occurred
                    self.conn.commit()
                # Close the connection
                self.conn.close()
        except Exception as e:
            print(f"Error occurred while closing connection: {e}")

    def select_data(self, query):
        """
        This will select the data based on the query provided.
        :param query: query to be executed
        :return: DataFrame
        """
        try:
            # Creating cursor without context management
            connection_cursor = self.conn.cursor()
            connection_cursor.execute(query)
            rows = connection_cursor.fetchall()
            if rows:
                columns = [str(column[0]) for column in connection_cursor.description]
                dataframe = pd.DataFrame.from_records(rows, columns=columns)
                return dataframe
            else:
                return pd.DataFrame()
        except Exception as e:
            print(f"Error occurred while executing select query: {e}")
            raise e

    def bulk_update(self, table, updates):
        """
        Perform bulk update operations on the specified table.
        Args:
            table (str): The name of the table where the updates will be applied.
            updates (list): A list of dictionaries representing the updates to be performed.
                Each dictionary should contain 'columns', 'values', and 'condition'.
        Example:
            [{'columns': ['col1', 'col2'], 'values': ['val1', 'val2'], 'condition': 'col3 = ?'}]
            This will update 'col1' and 'col2' with 'val1' and 'val2' where 'col3' meets the condition.
        Raises:
            Exception: If an error occurs during the update operation.
        """
        try:
            # Create cursor without context manager
            connection_cursor = self.conn.cursor()
            for update in updates:
                update_query = f"UPDATE {table} SET {', '.join([f'{col} = ?' for col in update['columns']])} WHERE {update['condition']}"
                connection_cursor.execute(update_query, update['values'])
        except Exception as e:
            print(f"Error occurred while bulk updating data: {e}")
            raise e

    def single_insert(self, table, data):
        """
        Perform a single insert operation for a row of data into the specified table.
        Args:
            table (str): The name of the table where the data will be inserted.
            data (dict): A dictionary representing the data to be inserted.
                The keys should correspond to column names and the values
                should correspond to the values to be inserted.
        Raises:
            Exception: If an error occurs during the insert operation.
        """
        try:
            connection_cursor = self.conn.cursor()
            columns = ', '.join(data.keys())
            placeholders = ', '.join(['?' for _ in data.values()])
            insert_query = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
            connection_cursor.execute(insert_query, list(data.values()))
        except Exception as e:
            print(f"Error occurred while performing single insert: {e}")
            raise e

    def bulk_insert(self, table, df):
        """
        Perform bulk insert operations using a DataFrame into the specified table.
        Args:
            table (str): The name of the table where the data will be inserted.
            df (pd.DataFrame): DataFrame containing the data to be inserted.
        Raises:
            Exception: If an error occurs during the insert operation.
        """
        try:
            connection_cursor = self.conn.cursor()
            # Extract column names and values from the DataFrame
            columns = list(df.columns)
            values = df.values.tolist()
            # Generate the INSERT INTO query
            insert_query = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({', '.join(['?'] * len(columns))})"
            # Execute the query
            connection_cursor.executemany(insert_query, values)
        except Exception as e:
            print(f"Error occurred while bulk inserting data: {e}")
            raise e


df = pd.DataFrame({
    'mlc_component': ['1_bj'],
    'mlc_section': ['1_bj'],
    'mlc_key': ['1_bj'],
    'mlc_value': ['1_bj'],
})
data_to_insert = {
    'mlc_component': '3_bj',
    'mlc_section': '3_bj',
    'mlc_key': '3_bj',
    'mlc_value': '3_bj',
}
data1_to_insert = {
    'mlc_component': '2_bj',
    'mlc_section': '2_bj',
    'mlc_key': '2_bj',
    'mlc_value': '2_bj',
}

with DBConnection() as db_conn:
    try:
        # db_conn.bulk_insert("data.data_configuration", df)
        # cursor.execute("BEGIN TRANSACTION")  # Start transaction
        db_conn.bulk_insert('data.data_configuration', df)
        db_conn.single_insert('data.data_configuration', data_to_insert)
        db_conn.single_insert('data.data_configuration', data1_to_insert)
        # db_conn.bulk_insert("data_configuration", df)
    except Exception as e:
        db_conn.conn.rollback()  # Rollback on any exception
        raise e
```
The code above works as I expect. Here is the link for your reference:
https://github.com/mkleehammer/pyodbc/wiki/Cursor#context-manager
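To summarize what that page documents, here is a minimal sketch based on my reading of the pyodbc wiki (the table and column names are just the placeholders used above): a cursor used as a context manager commits the connection when the block exits.

```python
import pyodbc

conn = pyodbc.connect('connection details', autocommit=False)

# Using the cursor as a context manager commits the connection on exit of the block:
with conn.cursor() as cursor:
    cursor.execute("UPDATE data.data_configuration SET mlc_value = ? WHERE mlc_key = ?",
                   ('new_value', '1_bj'))

# ...which, as I understand the wiki page, is roughly equivalent to:
cursor = conn.cursor()
cursor.execute("UPDATE data.data_configuration SET mlc_value = ? WHERE mlc_key = ?",
               ('new_value', '1_bj'))
conn.commit()
```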
I believe that in the mage-ai code the cursor is used with a context manager in every method, which is why the erroneous behavior described above is observed; a sketch of that pattern follows below.
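For illustration only, here is a hypothetical sketch of that pattern (this is not the actual mage-ai code, and `single_insert_with_cursor_cm` is a made-up name). Because each cursor `with` block commits on exit, every call becomes its own transaction and a later `rollback()` can no longer undo the earlier inserts:

```python
import pyodbc


def single_insert_with_cursor_cm(conn, table, data):
    """Hypothetical helper that wraps the cursor in a context manager per call."""
    columns = ', '.join(data.keys())
    placeholders = ', '.join(['?'] * len(data))
    query = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    with conn.cursor() as cursor:
        # The cursor's __exit__ commits the connection, so this call is its own transaction.
        cursor.execute(query, list(data.values()))


conn = pyodbc.connect('connection details', autocommit=False)
try:
    # data_to_insert and data1_to_insert are the dictionaries defined in the script above.
    single_insert_with_cursor_cm(conn, 'data.data_configuration', data_to_insert)
    single_insert_with_cursor_cm(conn, 'data.data_configuration', data1_to_insert)  # suppose this fails
except Exception:
    conn.rollback()  # too late: the first row was already committed by the cursor context manager
    raise
finally:
    conn.close()
```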
For the snippet of code that proves this point, see the context-manager section of the above-mentioned wiki page.
For additional information, you can go through the above-mentioned link. Hope it helps!
@wangxiaoyou1993 kindly assign someone to resolve this issue.
@wangxiaoyou1993 Please let me know if you have any updates on the above issue.