no history

This commit is contained in:
yumoqing 2024-09-23 14:20:23 +08:00
commit 6692ee8ed2
77 changed files with 6666 additions and 0 deletions

285
README.md Executable file
View File

@ -0,0 +1,285 @@
# ahserver
ahserver is an http(s) server based on the aiohttp asynchronous framework.
ahserver capabilities:
* user authorization and authentication support
* https support
* processors for registered file types
* pre-defined variables and functions that can be called by processors
* multiple database connections and connection pooling
* an easy way to wrap SQL
* configuration data loaded from a JSON file stored at ./conf/config.json
* uploaded files automatically saved under the config.filesroot folder
* i18n support
* processors include:
+ 'dspy': files with the '.dspy' suffix are processed as python scripts
+ 'tmpl': files with the '.tmpl' suffix are processed as templates
+ 'md': files with the '.md' suffix are processed as markdown files
+ 'xlsxds': files with the '.xlsxds' suffix are processed as data sources from xlsx files
+ 'sqlds': files with the '.sqlds' suffix are processed as data sources from a database via a SQL command
## Requirements
see requirements.txt
[pyutils](https://github.com/yumoqing/pyutils)
[sqlor](https://github.com/yumoqing/sqlor)
## How to use
see ah.py
```
from ahserver.configuredServer import ConfiguredServer
if __name__ == '__main__':
	server = ConfiguredServer()
	server.run()
```
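To plug in your own authentication logic, pass an AuthAPI subclass to `ConfiguredServer`, as ah.py in this repository does. A minimal sketch (the `MyAuthAPI` name is made up for the example; the overridden hook mirrors the one defined in ahserver/auth_api.py, and the permissive return value is only a placeholder):
```
from ahserver.configuredServer import ConfiguredServer
from ahserver.auth_api import AuthAPI

class MyAuthAPI(AuthAPI):
	# checkUserPermission is the hook called by the auth middleware for every request
	async def checkUserPermission(self, user, path):
		return True  # placeholder: allow everything

if __name__ == '__main__':
	server = ConfiguredServer(MyAuthAPI)
	server.run()
```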
## Folder structure
+ app
+ |-ah.py
+ |--ahserver
+ |-conf
+ |-config.json
+ |-i18n
## Configuration file content
ahserver uses the JSON format for its configuration; the following is a sample:
```
{
"databases":{
"aiocfae":{
"driver":"aiomysql",
"async_mode":true,
"coding":"utf8",
"dbname":"cfae",
"kwargs":{
"user":"test",
"db":"cfae",
"password":"test123",
"host":"localhost"
}
},
"cfae":{
"driver":"mysql.connector",
"coding":"utf8",
"dbname":"cfae",
"kwargs":{
"user":"test",
"db":"cfae",
"password":"test123",
"host":"localhost"
}
}
},
"website":{
"paths":[
["$[workdir]$/../usedpkgs/antd","/antd"],
["$[workdir]$/../wolon",""]
],
"host":"0.0.0.0",
"port":8080,
"coding":"utf-8",
"ssl":{
"crtfile":"$[workdir]$/conf/www.xxx.com.pem",
"keyfile":"$[workdir]$/conf/www.xxx.com.key"
},
"indexes":[
"index.html",
"index.tmpl",
"index.dspy",
"index.md"
],
"visualcoding":{
"default_root":"/samples/vc/test",
"userroot":{
"ymq":"/samples/vc/ymq",
"root":"/samples/vc/root"
},
"jrjpath":"/samples/vc/default"
},
"processors":[
[".xlsxds","xlsxds"],
[".sqlds","sqlds"],
[".tmpl.js","tmpl"],
[".tmpl.css","tmpl"],
[".html.tmpl","tmpl"],
[".tmpl","tmpl"],
[".dspy","dspy"],
[".md","md"]
]
},
"langMapping":{
"zh-Hans-CN":"zh-cn",
"zh-CN":"zh-cn",
"en-us":"en",
"en-US":"en"
}
}
```
### database configuration
the packages ahserver uses for the supported database engines are:
* oracle: cx_Oracle
* mysql: mysql-connector
* postgresql: psycopg2
* sql server: pymssql
you can use other packages, but you must then change the "driver" value in the database connection definition to the package name you use.
in the "databases" section of config.json you can define one or more database connections, and multiple database engines (ORACLE, mysql, postgreSQL) can be used at the same time.
to define a database connection, follow the json formats below.
* mysql or mariadb
```
"metadb":{
"driver":"mysql.connector",
"coding":"utf8",
"dbname":"sampledb",
"kwargs":{
"user":"user1",
"db":"sampledb",
"password":"user123",
"host":"localhost"
}
}
```
the "dbname" and "db" values should be the same; both are the database name in the mysql server.
* Oracle
```
"db_ora":{
"driver":"cx_Oracle",
"coding":"utf8",
"dbname":sampledb",
"kwargs":{
"user":"user1",
"host":"localhost",
"dsn":"10.0.185.137:1521/SAMPLEDB"
}
}
```
* SQL Server
```
"db_mssql":{
"driver":"pymssql",
"coding":"utf8",
"dbname":"sampledb",
"kwargs":{
"user":"user1",
"database":"sampledb",
"password":"user123",
"server":"localhost",
"port":1433,
"charset":"utf8"
}
}
```
* PostgreSQL
```
"db_pg":{
"driver":"psycopg2",
"dbname":"testdb",
"coding":"utf8",
"kwargs":{
"database":"testdb",
"user":"postgres",
"password":"pass123",
"host":"127.0.0.1",
"port":"5432"
}
}
```
### https support
To enable https, config.website.ssl must be set in the config.json file (see the sample above).
### website configuration
#### paths
ahserver can serve content (static files and dynamic content rendered by its processors) that resides in different folders on the server file system.
ahserver looks up the content identified by an http url in the order of the paths listed in the "paths" list inside the "website" section of config.json.
#### processors
all the processors ahserver uses must be listed here.
#### host
by default, '0.0.0.0'
#### port
by default, 8080
#### coding
ahserver recommends using 'utf-8'
### langMapping
different browsers send different 'Accept-Language' values even for the same language, so ahserver uses the "langMapping" definition to map multiple browser language codes to the same i18n file.
## internationalization
ahserver uses MiniI18N from the appPublic module in the pyutils package to implement i18n support.
it searches for translation text in ms* txt files in a folder named after the language, inside the i18n folder under workdir; workdir is the folder where the ahserver program resides, or the one identified by command line parameters.
## performance
To be listed here.
## Behind nginx
when ahserver runs behind nginx, nginx should forward the following headers to ahserver (a sample nginx snippet is shown after the list):
* X-Forwarded-For: client real ip
* X-Forwarded-Scheme: scheme in client browser
* X-Forwarded-Host: host in client browser
* X-Forwarded-Url: url in client browser
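A minimal nginx location block that forwards these headers might look like the following sketch (the upstream address and port are assumptions; adjust them to your deployment):
```
location / {
	proxy_pass http://127.0.0.1:8080;
	proxy_set_header X-Forwarded-For $remote_addr;
	proxy_set_header X-Forwarded-Scheme $scheme;
	proxy_set_header X-Forwarded-Host $host;
	proxy_set_header X-Forwarded-Url $request_uri;
}
```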
## environment for processors
when coding a processor script, ahserver provides an environment for building applications: the following modules, functions, classes and variables are pre-loaded into the script's namespace (a minimal dspy example is shown after the lists below).
### modules:
* time
* datetime
* random
* json
### functions:
* configValue
* isNone
* int
* str
* float
* type
* str2date
* str2datetime
* curDatetime
* uuid
* runSQL
* runSQLPaging
* runSQLIterator
* runSQLResultFields
* getTables
* getTableFields
* getTablePrimaryKey
* getTableForignKeys
* folderInfo
* abspath
* request2ns
* CRUD
* data2xlsx
* xlsxdata
* openfile
* i18n
* i18nDict
* absurl
### variables
* resource
* terminalType
### classes
* ArgsConvert
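As an illustration, a '.dspy' file body is wrapped by ahserver into an async function, so it can use these names directly and its return value becomes the response. A minimal sketch (the file name and the 'name' parameter are made up for the example):
```
# hello.dspy -- ahserver wraps this body into "async def myfunc(request, **ns)"
name = params_kw.get('name', 'world')	# params_kw holds the query/form parameters
return {
	"greeting": "hello %s" % name,
	"now": str(curDatetime())	# curDatetime is one of the pre-defined functions above
}
```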

22
ah.py Executable file
View File

@ -0,0 +1,22 @@
from ahserver.configuredServer import ConfiguredServer
from ahserver.auth_api import AuthAPI
"""
need to implement your AuthAPI
class MyAuthAPI:
def needAuth(self,path):
return Fasle # do not need authentication
return True # need authentication
async def getPermissionNeed(self,path):
return 'admin'
async def checkUserPassword(self,user_id,password):
return True
async def getUserPermissions(self,user):
return ['admin','view']
"""
if __name__ == '__main__':
server = ConfiguredServer(AuthAPI)
server.run()

297
ahserver.egg-info/PKG-INFO Executable file
View File

@ -0,0 +1,297 @@
Metadata-Version: 2.1
Name: ahserver
Version: 0.3.4
Summary: ahserver
Home-page: https://github.com/yumoqing/ahserver
Author: yumoqing
Author-email: yumoqing@gmail.com
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Description-Content-Type: text/markdown
# ahserver
ahserver is an http(s) server based on the aiohttp asynchronous framework.
ahserver capabilities:
* user authorization and authentication support
* https support
* processors for registered file types
* pre-defined variables and functions that can be called by processors
* multiple database connections and connection pooling
* an easy way to wrap SQL
* configuration data loaded from a JSON file stored at ./conf/config.json
* uploaded files automatically saved under the config.filesroot folder
* i18n support
* processors include:
+ 'dspy': files with the '.dspy' suffix are processed as python scripts
+ 'tmpl': files with the '.tmpl' suffix are processed as templates
+ 'md': files with the '.md' suffix are processed as markdown files
+ 'xlsxds': files with the '.xlsxds' suffix are processed as data sources from xlsx files
+ 'sqlds': files with the '.sqlds' suffix are processed as data sources from a database via a SQL command
## Requirements
see requirements.txt
[pyutils](https://github.com/yumoqing/pyutils)
[sqlor](https://github.com/yumoqing/sqlor)
## How to use
see ah.py
```
from ahserver.configuredServer import ConfiguredServer
if __name__ == '__main__':
	server = ConfiguredServer()
	server.run()
```
## Folder structure
+ app
+ |-ah.py
+ |--ahserver
+ |-conf
+ |-config.json
+ |-i18n
## Configuration file content
ahserver uses the JSON format for its configuration; the following is a sample:
```
{
"databases":{
"aiocfae":{
"driver":"aiomysql",
"async_mode":true,
"coding":"utf8",
"dbname":"cfae",
"kwargs":{
"user":"test",
"db":"cfae",
"password":"test123",
"host":"localhost"
}
},
"cfae":{
"driver":"mysql.connector",
"coding":"utf8",
"dbname":"cfae",
"kwargs":{
"user":"test",
"db":"cfae",
"password":"test123",
"host":"localhost"
}
}
},
"website":{
"paths":[
["$[workdir]$/../usedpkgs/antd","/antd"],
["$[workdir]$/../wolon",""]
],
"host":"0.0.0.0",
"port":8080,
"coding":"utf-8",
"ssl":{
"crtfile":"$[workdir]$/conf/www.xxx.com.pem",
"keyfile":"$[workdir]$/conf/www.xxx.com.key"
},
"indexes":[
"index.html",
"index.tmpl",
"index.dspy",
"index.md"
],
"visualcoding":{
"default_root":"/samples/vc/test",
"userroot":{
"ymq":"/samples/vc/ymq",
"root":"/samples/vc/root"
},
"jrjpath":"/samples/vc/default"
},
"processors":[
[".xlsxds","xlsxds"],
[".sqlds","sqlds"],
[".tmpl.js","tmpl"],
[".tmpl.css","tmpl"],
[".html.tmpl","tmpl"],
[".tmpl","tmpl"],
[".dspy","dspy"],
[".md","md"]
]
},
"langMapping":{
"zh-Hans-CN":"zh-cn",
"zh-CN":"zh-cn",
"en-us":"en",
"en-US":"en"
}
}
```
### database configuration
the packages ahserver uses for the supported database engines are:
* oracle: cx_Oracle
* mysql: mysql-connector
* postgresql: psycopg2
* sql server: pymssql
you can use other packages, but you must then change the "driver" value in the database connection definition to the package name you use.
in the "databases" section of config.json you can define one or more database connections, and multiple database engines (ORACLE, mysql, postgreSQL) can be used at the same time.
to define a database connection, follow the json formats below.
* mysql or mariadb
```
"metadb":{
"driver":"mysql.connector",
"coding":"utf8",
"dbname":"sampledb",
"kwargs":{
"user":"user1",
"db":"sampledb",
"password":"user123",
"host":"localhost"
}
}
```
the "dbname" and "db" values should be the same; both are the database name in the mysql server.
* Oracle
```
"db_ora":{
"driver":"cx_Oracle",
"coding":"utf8",
"dbname":sampledb",
"kwargs":{
"user":"user1",
"host":"localhost",
"dsn":"10.0.185.137:1521/SAMPLEDB"
}
}
```
* SQL Server
```
"db_mssql":{
"driver":"pymssql",
"coding":"utf8",
"dbname":"sampledb",
"kwargs":{
"user":"user1",
"database":"sampledb",
"password":"user123",
"server":"localhost",
"port":1433,
"charset":"utf8"
}
}
```
* PostgreSQL
```
"db_pg":{
"driver":"psycopg2",
"dbname":"testdb",
"coding":"utf8",
"kwargs":{
"database":"testdb",
"user":"postgres",
"password":"pass123",
"host":"127.0.0.1",
"port":"5432"
}
}
```
### https support
To enable https, config.website.ssl must be set in the config.json file (see the sample above).
### website configuration
#### paths
ahserver can serve content (static files and dynamic content rendered by its processors) that resides in different folders on the server file system.
ahserver looks up the content identified by an http url in the order of the paths listed in the "paths" list inside the "website" section of config.json.
#### processors
all the processors ahserver uses must be listed here.
#### host
by default, '0.0.0.0'
#### port
by default, 8080
#### coding
ahserver recommends using 'utf-8'
### langMapping
different browsers send different 'Accept-Language' values even for the same language, so ahserver uses the "langMapping" definition to map multiple browser language codes to the same i18n file.
## internationalization
ahserver uses MiniI18N from the appPublic module in the pyutils package to implement i18n support.
it searches for translation text in ms* txt files in a folder named after the language, inside the i18n folder under workdir; workdir is the folder where the ahserver program resides, or the one identified by command line parameters.
## performance
To be listed here.
## Behind nginx
when ahserver runs behind nginx, nginx should forward the following headers to ahserver:
* X-Forwarded-For: client real ip
* X-Forwarded-Scheme: scheme in client browser
* X-Forwarded-Host: host in client browser
* X-Forwarded-Url: url in client browser
## environment for processors
when coding a processor script, ahserver provides an environment for building applications: the following modules, functions, classes and variables are pre-loaded into the script's namespace.
### modules:
* time
* datetime
* random
* json
### functions:
* configValue
* isNone
* int
* str
* float
* type
* str2date
* str2datetime
* curDatetime
* uuid
* runSQL
* runSQLPaging
* runSQLIterator
* runSQLResultFields
* getTables
* getTableFields
* getTablePrimaryKey
* getTableForignKeys
* folderInfo
* abspath
* request2ns
* CRUD
* data2xlsx
* xlsxdata
* openfile
* i18n
* i18nDict
* absurl
### variables
* resource
* terminalType
### classes
* ArgsConvert

36
ahserver.egg-info/SOURCES.txt Executable file
View File

@ -0,0 +1,36 @@
README.md
setup.py
ahserver/__init__.py
ahserver/auth_api.py
ahserver/baseProcessor.py
ahserver/configuredServer.py
ahserver/dbadmin.py
ahserver/dsProcessor.py
ahserver/error.py
ahserver/filedownload.py
ahserver/filestorage.py
ahserver/filetest.py
ahserver/functionProcessor.py
ahserver/globalEnv.py
ahserver/llmProcessor.py
ahserver/llm_client.py
ahserver/loadplugins.py
ahserver/myTE.py
ahserver/p2p_middleware.py
ahserver/processorResource.py
ahserver/proxyProcessor.py
ahserver/restful.py
ahserver/serverenv.py
ahserver/sqldsProcessor.py
ahserver/uriop.py
ahserver/url2file.py
ahserver/utils.py
ahserver/version.py
ahserver/websocketProcessor.py
ahserver/xlsxData.py
ahserver/xlsxdsProcessor.py
ahserver.egg-info/PKG-INFO
ahserver.egg-info/SOURCES.txt
ahserver.egg-info/dependency_links.txt
ahserver.egg-info/requires.txt
ahserver.egg-info/top_level.txt

ahserver.egg-info/dependency_links.txt
View File

@ -0,0 +1 @@

ahserver.egg-info/requires.txt
View File

@ -0,0 +1,17 @@
asyncio
aiofiles
aiodns
cchardet
aiohttp
aiohttp_session
aiohttp_auth_autz
aiohttp-cors
aiomysql
aioredis
psycopg2-binary
aiopg
jinja2
ujson
openpyxl
pillow
py-natpmp

ahserver.egg-info/top_level.txt
View File

@ -0,0 +1 @@
ahserver

0
ahserver/__init__.py Executable file
View File

Binary file not shown.

Binary file not shown.

166
ahserver/auth_api.py Executable file
View File

@ -0,0 +1,166 @@
import time
import uuid
from traceback import print_exc
from aiohttp_auth import auth
from aiohttp_auth.auth.ticket_auth import TktAuthentication
from aiohttp_session.redis_storage import RedisStorage
from os import urandom
from aiohttp import web
import aiohttp_session
import aioredis
import base64
import binascii
from aiohttp_session import get_session, session_middleware, Session
from aiohttp_session.cookie_storage import EncryptedCookieStorage
from aiohttp_session.redis_storage import RedisStorage
from appPublic.jsonConfig import getConfig
from appPublic.rsawrap import RSA
from appPublic.log import info, debug, warning, error, critical, exception
def get_client_ip(obj, request):
ip = request.headers.get('X-Forwarded-For')
if not ip:
ip = request.remote
request['client_ip'] = ip
return ip
async def get_session_user(request):
userid = await auth.get_auth(request)
return userid
async def user_login(request, userid):
await auth.remember(request, userid)
async def user_logout(request):
await auth.forget(request)
class MyRedisStorage(RedisStorage):
def key_gen(self, request):
key = request.headers.get('client_uuid')
if not key:
key = uuid.uuid4().hex
return key
if isinstance(key, str):
key = key.encode('utf-8')
key = binascii.hexlify(key)
key = key.decode('utf-8')
return key
async def save_session(self, request: web.Request,
response: web.StreamResponse,
session: Session) -> None:
key = session.identity
if key is None:
key = self.key_gen(request)
self.save_cookie(response, key, max_age=session.max_age)
else:
if session.empty:
self.save_cookie(response, "", max_age=session.max_age)
else:
key = str(key)
self.save_cookie(response, key, max_age=session.max_age)
data_str = self._encoder(self._get_session_data(session))
await self._redis.set(
self.cookie_name + "_" + key,
data_str,
ex=session.max_age,
)
class AuthAPI:
def __init__(self):
self.conf = getConfig()
async def checkUserPermission(self, user, path):
# print('************* checkUserPermission() use default one ****************')
return True
def getPrivateKey(self):
if not hasattr(self,'rsaEngine'):
self.rsaEngine = RSA()
fname = self.conf.website.rsakey.privatekey
self.privatekey = self.rsaEngine.read_privatekey(fname)
return self.privatekey
def rsaDecode(self,cdata):
self.getPrivateKey()
return self.rsaEngine.decode(self.privatekey,cdata)
async def setupAuth(self,app):
# setup session middleware in aiohttp fashion
b = str(self.conf.website.port).encode('utf-8')
cnt = 32 - len(b)
secret = b + b'iqwertyuiopasdfghjklzxcvbnm12345'[:cnt]
storage = EncryptedCookieStorage(secret)
if self.conf.website.session_redis:
url = self.conf.website.session_redis.url
# redis = await aioredis.from_url("redis://127.0.0.1:6379")
redis = await aioredis.from_url(url)
storage = MyRedisStorage(redis)
aiohttp_session.setup(app, storage)
# Create an auth ticket mechanism that expires after 1 minute (60
# seconds), and has a randomly generated secret. Also includes the
# optional inclusion of the users IP address in the hash
session_max_time = 120
session_reissue_time = 30
if self.conf.website.session_max_time:
session_max_time = self.conf.website.session_max_time
if self.conf.website.session_reissue_time:
session_reissue_time = self.conf.website.session_reissue_time
def _new_ticket(self, request, user_id):
client_uuid = request.headers.get('client_uuid')
ip = self._get_ip(request)
valid_until = int(time.time()) + self._max_age
# print(f'hack: my _new_ticket() called ... remote {ip=}, {client_uuid=}')
return self._ticket.new(user_id,
valid_until=valid_until,
client_ip=ip,
user_data=client_uuid)
TktAuthentication._get_ip = get_client_ip
TktAuthentication._new_ticket = _new_ticket
policy = auth.SessionTktAuthentication(secret,
session_max_time,
reissue_time=session_reissue_time,
include_ip=True)
# setup aiohttp_auth.auth middleware in aiohttp fashion
# print('policy = ', policy)
auth.setup(app, policy)
app.middlewares.append(self.checkAuth)
@web.middleware
async def checkAuth(self,request,handler):
info(f'checkAuth() called ... {request.path=}')
t1 = time.time()
path = request.path
user = await auth.get_auth(request)
is_ok = await self.checkUserPermission(user, path)
t2 = time.time()
ip = get_client_ip(None, request)
if is_ok:
try:
ret = await handler(request)
t3 = time.time()
info(f'timecost=client({ip}) {user} access {path} cost {t3-t1}, ({t2-t1})')
return ret
except Exception as e:
t3 = time.time()
info(f'Exception=client({ip}) {user} access {path} cost {t3-t1}, ({t2-t1}), except={e}')
print(f'Exception=client({ip}) {user} access {path} cost {t3-t1}, ({t2-t1}), except={e}')
print_exc()
raise e
if user is None:
info(f'timecost=client({ip}) {user} access need login to access {path} ({t2-t1})')
raise web.HTTPUnauthorized
info(f'timecost=client({ip}) {user} access {path} forbidden ({t2-t1})')
raise web.HTTPForbidden()
async def needAuth(self,path):
return False

243
ahserver/baseProcessor.py Executable file
View File

@ -0,0 +1,243 @@
import os
import re
import json
import codecs
import aiofiles
from aiohttp.web_request import Request
from aiohttp.web_response import Response, StreamResponse
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from appPublic.folderUtils import listFile
from appPublic.log import info, debug, warning, error, critical, exception
from .utils import unicode_escape
from .serverenv import ServerEnv
from .filetest import current_fileno
class ObjectCache:
def __init__(self):
self.cache = {}
def store(self,path,obj):
o = self.cache.get(path,None)
if o is not None:
try:
del o.cached_obj
except:
pass
o = DictObject()
o.cached_obj = obj
o.mtime = os.path.getmtime(path)
self.cache[path] = o
def get(self,path):
o = self.cache.get(path)
if o:
if os.path.getmtime(path) > o.mtime:
return None
return o.cached_obj
return None
class BaseProcessor:
@classmethod
def isMe(self,name):
return name=='base'
def __init__(self,path,resource):
self.env_set = False
self.path = path
self.resource = resource
self.retResponse = None
# self.last_modified = os.path.getmtime(path)
# self.content_length = os.path.getsize(path)
self.headers = {
'Content-Type': 'text/html; utf-8',
'Accept-Ranges': 'bytes'
}
self.content = ''
async def be_call(self, request, params={}):
return await self.path_call(request, params=params)
async def set_run_env(self, request, params={}):
if self.env_set:
return
self.real_path = self.resource.url2file(request.path)
g = ServerEnv()
self.run_ns = DictObject()
self.run_ns.update(g)
self.run_ns.update(self.resource.y_env)
self.run_ns['request'] = request
self.run_ns['app'] = request.app
kw = await self.run_ns['request2ns']()
kw.update(params)
self.run_ns['params_kw'] = kw
self.run_ns.update(kw)
self.run_ns['ref_real_path'] = self.real_path
self.run_ns['processor'] = self
self.env_set = True
async def execute(self,request):
await self.set_run_env(request)
await self.datahandle(request)
return self.content
def set_response_headers(self, response):
response.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
# response.headers['Access-Control-Allow-Credentials'] = 'true'
# response.headers['Access-Control-Allow-Origin'] = '47.93.12.75'
async def handle(self,request):
await self.execute(request)
jsonflg = False
if self.retResponse is not None:
self.set_response_headers(self.retResponse)
return self.retResponse
elif isinstance(self.content, Response):
return self.content
elif isinstance(self.content, StreamResponse):
return self.content
elif isinstance(self.content, DictObject):
self.content = json.dumps(self.content, indent=4)
jsonflg = True
elif isinstance(self.content, dict):
self.content = json.dumps(self.content, indent=4)
jsonflg = True
elif isinstance(self.content, list):
self.content = json.dumps(self.content, indent=4)
jsonflg = True
elif isinstance(self.content, tuple):
self.content = json.dumps(self.content, indent=4)
jsonflg = True
elif isinstance(self.content, bytes):
self.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
self.headers['Content-Length'] = str(len(self.content))
resp = Response(body=self.content,headers=self.headers)
self.set_response_headers(resp)
return resp
else:
try:
json.loads(self.content)
jsonflg = True
except:
pass
if jsonflg:
self.headers['Content-Type'] = "application/json; utf-8"
self.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
resp = Response(text=self.content,headers=self.headers)
self.set_response_headers(resp)
return resp
async def datahandle(self,request):
debug('*******Error*************')
self.content=''
def setheaders(self):
pass
# self.headers['Content-Length'] = str(len(self.content))
class TemplateProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='tmpl'
async def path_call(self, request, params={}):
await self.set_run_env(request, params=params)
path = request.path
ns = self.run_ns
te = self.run_ns['tmpl_engine']
return await te.render(path,**ns)
async def datahandle(self,request):
self.content = await self.path_call(request)
def setheaders(self):
super(TemplateProcessor,self).setheaders()
if self.path.endswith('.tmpl.css'):
self.headers['Content-Type'] = 'text/css; utf-8'
elif self.path.endswith('.tmpl.js'):
self.headers['Content-Type'] = 'application/javascript ; utf-8'
else:
self.headers['Content-Type'] = 'text/html; utf-8'
class BricksUIProcessor(TemplateProcessor):
@classmethod
def isMe(self,name):
# print(f'{name=} is a bui')
return name=='bui'
async def datahandle(self, request):
params = await self.resource.y_env['request2ns']()
await super().datahandle(request)
if params.get('_webbricks_',None):
return
txt = self.content
entire_url = self.run_ns.get('entire_url')
content0 = await self.resource.path_call(request,entire_url('/bricks/header.tmpl'))
content2 = await self.resource.path_call(request,entire_url('/bricks/footer.tmpl'))
self.content = '%s%s%s' % (content0, txt, content2)
class PythonScriptProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='dspy'
async def loadScript(self, path):
data = ''
async with aiofiles.open(path,'r', encoding='utf-8') as f:
data = await f.read()
b= ''.join(data.split('\r'))
lines = b.split('\n')
lines = ['\t' + l for l in lines ]
txt = "async def myfunc(request,**ns):\n" + '\n'.join(lines)
return txt
async def path_call(self, request,params={}):
await self.set_run_env(request, params=params)
lenv = self.run_ns
del lenv['request']
txt = await self.loadScript(self.real_path)
# print(self.real_path, "#########", txt)
exec(txt,lenv,lenv)
func = lenv['myfunc']
return await func(request,**lenv)
async def datahandle(self,request):
self.content = await self.path_call(request)
class MarkdownProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='md'
async def datahandle(self,request:Request):
data = ''
async with aiofiles.open(self.real_path,'r',encoding='utf-8') as f:
data = await f.read()
self.content = self.urlreplace(data, request)
def urlreplace(self,mdtxt,request):
p = '\[(.*)\]\((.*)\)'
return re.sub(p,
lambda x:'['+x.group(1)+'](' + self.resource.entireUrl(request, x.group(2)) + ')',
mdtxt)
def getProcessor(name):
# print(f'getProcessor({name})')
return _getProcessor(BaseProcessor, name)
def _getProcessor(kclass,name):
for k in kclass.__subclasses__():
if not hasattr(k,'isMe'):
continue
if k.isMe(name):
return k
a = _getProcessor(k,name)
if a is not None:
return a
return None

91
ahserver/configuredServer.py Executable file
View File

@ -0,0 +1,91 @@
import os,sys
from sys import platform
import time
import ssl
from socket import *
from aiohttp import web
from appPublic.folderUtils import ProgramPath
from appPublic.dictObject import DictObject
from appPublic.jsonConfig import getConfig
from appPublic.log import info, debug, warning, error, critical, exception
from appPublic.registerfunction import RegisterCoroutine
from sqlor.dbpools import DBPools
from .processorResource import ProcessorResource
from .auth_api import AuthAPI
from .myTE import setupTemplateEngine
from .globalEnv import initEnv
from .filestorage import TmpFileRecord
from .loadplugins import load_plugins
class AHApp(web.Application):
def __init__(self, *args, **kw):
kw['client_max_size'] = 1024000000
super().__init__(*args, **kw)
self.data = DictObject()
def set_data(self, k, v):
self.data[k] = v
def get_data(self, k):
return self.data.get(k, DictObject())
class ConfiguredServer:
def __init__(self, auth_klass=AuthAPI, workdir=None):
self.auth_klass = auth_klass
self.workdir = workdir
if self.workdir is not None:
pp = ProgramPath()
config = getConfig(self.workdir,
{'workdir':self.workdir,'ProgramPath':pp})
else:
config = getConfig()
if config.databases:
DBPools(config.databases)
self.config = config
initEnv()
setupTemplateEngine()
client_max_size = 1024 * 10240
if config.website.client_max_size:
client_max_size = config.website.client_max_size
self.app = AHApp(client_max_size=client_max_size)
load_plugins(self.workdir)
async def build_app(self):
rf = RegisterCoroutine()
await rf.exe('ahapp_built', self.app)
auth = self.auth_klass()
await auth.setupAuth(self.app)
return self.app
def run(self, port=None):
config = getConfig()
self.configPath(config)
a = TmpFileRecord()
ssl_context = None
if port is None:
port = config.website.port or 8080
if config.website.ssl:
ssl_context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ssl_context.load_cert_chain(config.website.ssl.crtfile,
config.website.ssl.keyfile)
reuse_port = None
if platform != 'win32':
reuse_port = True
print('reuse_port=', reuse_port)
web.run_app(self.build_app(),host=config.website.host or '0.0.0.0',
port=port,
reuse_port=reuse_port,
ssl_context=ssl_context)
def configPath(self,config):
for p,prefix in config.website.paths:
res = ProcessorResource(prefix,p,show_index=True,
follow_symlinks=True,
indexes=config.website.indexes,
processors=config.website.processors)
self.app.router.register_resource(res)

57
ahserver/dbadmin.py Executable file
View File

@ -0,0 +1,57 @@
import os
import re
import traceback
from aiohttp.web_response import Response
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
)
from aiohttp import web
from aiohttp.web_request import Request
from aiohttp.web_routedef import AbstractRouteDef
from aiohttp.web import json_response
from sqlor.crud import CRUD
from appPublic.dictObject import multiDict2Dict
from appPublic.jsonConfig import getConfig
from appPublic.log import info, debug, warning, error, critical, exception
from .error import Error,Success
actions = [
"browse",
"add",
"update",
"filter"
]
class DBAdmin:
def __init__(self, request,dbname,tablename, action):
self.dbname = dbname
self.tablename = tablename
self.request = request
self.action = action
if action not in actions:
debug('action not defined:%s' % action)
raise HTTPNotFound
try:
self.crud = CRUD(dbname,tablename)
except Exception as e:
exception('e= %s' % e)
traceback.print_exc()
raise HTTPNotFound
async def render(self) -> Response:
try:
d = await self.crud.I()
return json_response(Success(d))
except Exception as e:
exception('except=%s' % e)
traceback.print_exc()
return json_response(Error(errno='metaerror',msg='get metadata error'))

67
ahserver/dsProcessor.py Executable file
View File

@ -0,0 +1,67 @@
import codecs
import json
import aiofiles
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from .baseProcessor import BaseProcessor
from .serverenv import ServerEnv
class DataSourceProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='ds'
def __init__(self,filename,k):
super(DataSourceProcessor,self).__init__(filename,k)
self.actions = {
'getdata':self.getData,
'pagingdata':self.getPagingData,
'arguments':self.getArgumentsDesc,
'resultFields':self.getDataDesc,
'gridlist':self.getGridlist,
}
self.g = ServerEnv()
async def getData(self,dict_data,ns,request):pass
async def getPagingData(self,dict_data,ns,request):pass
async def getArgumentsDesc(self,dict_data,ns,request):pass
async def getDataDesc(self,dict_data,ns,request):pass
async def getGridlist(self,dict_data,ns,request):
ret = self.getDataDesc(dict_data,ns,request)
ffs = [ f for f in ret if f.get('frozen',False) ]
fs = [ f for f in ret if not f['frozen'] ]
[ f.update({'hide':True}) for f in ffs if f.get('listhide',False) ]
[ f.update({'hide':True}) for f in fs if f.get('listhide') ]
d = {
"iconCls":"icon-search",
"url":self.resource.absUrl(request,request.path + '?action=pagingdata'),
"view":"bufferview",
"options":{
"pageSize":50,
"pagination":False
}
}
d.update({'fields':fs})
if len(ffs)>0:
d.update({'ffields':ffs})
ret = {
"__ctmpl__":"datagrid",
"data":d
}
return ret
async def path_call(self, request, path):
dict_data = {}
config = getConfig()
async with aiofiles.open(path,'r',encoding=config.website.coding) as f:
b = await f.read()
dict_data = json.loads(b)
ns = self.run_ns
act = ns.get('action','getdata')
action = self.actions.get(act)
return await action(dict_data,ns,request)
async def datahandle(self,request):
self.content = await self.path_call(request, self.path)

27
ahserver/error.py Executable file
View File

@ -0,0 +1,27 @@
def Error(errno='undefined error',msg='Error'):
return {
"status":"Error",
"data":{
"message":msg,
"errno":errno
}
}
def Success(data):
return {
"status":"OK",
"data":data
}
def NeedLogin(path):
return {
"status":"need_login",
"data":path
}
def NoPermission(path):
return {
"status":"no_permission",
"data":path
}

54
ahserver/filedownload.py Executable file
View File

@ -0,0 +1,54 @@
import os
import asyncio
import mimetypes
from aiohttp.web_exceptions import HTTPNotFound
from aiohttp.web import StreamResponse
from aiohttp import web
import aiofiles
from appPublic.rc4 import RC4
crypto_aim = 'God bless USA and others'
def path_encode(path):
rc4 = RC4()
return rc4.encode(path,crypto_aim)
def path_decode(dpath):
rc4 = RC4()
return rc4.decode(dpath,crypto_aim)
async def file_upload(request):
pass
async def file_download(request, filepath, content_type=None):
filename = os.path.basename(filepath)
r = web.FileResponse(filepath)
ct = content_type
if ct is None:
ct, encoding = mimetypes.guess_type(filepath)
if ct is not None:
r.content_type = ct
else:
r.content_type = 'application/octet-stream'
r.content_disposition = 'attachment; filename=%s' % filename
r.enable_compression()
return r
if os.path.exists(filepath):
length = os.path.getsize(filepath)
response = web.Response(
status=200,
headers = {
'Content-Disposition': 'attachment;filename={}'.format(filename)
}
)
await response.prepare(request)
cnt = 0
async with aiofiles.open(filepath, 'rb') as f:
chunk = await f.read(10240000)
cnt = cnt + len(chunk)
await response.write(chunk)
await response.fsyn()
await response.write_eof()
return response
raise HTTPNotFound

147
ahserver/filestorage.py Executable file
View File

@ -0,0 +1,147 @@
# fileUpload.py
import asyncio
import os
import time
import tempfile
import aiofiles
import json
import time
from appPublic.folderUtils import _mkdir
from appPublic.jsonConfig import getConfig
from appPublic.Singleton import SingletonDecorator
from appPublic.log import info, debug, warning, exception, critical
@SingletonDecorator
class TmpFileRecord:
def __init__(self, timeout=3600):
self.filetime = {}
self.changed_flg = False
self.timeout = timeout
self.time_period = 10
self.filename = self.savefilename()
self.loop = asyncio.get_event_loop()
self.loop.call_later(0.01, self.load)
def newtmpfile(self, path:str):
self.filetime[path] = time.time()
self.change_flg = True
def savefilename(self):
config = getConfig()
root = config.filesroot or tempfile.gettempdir()
pid = os.getpid()
return root + f'/tmpfile_rec_{pid}.json'
async def save(self):
if not self.change_flg:
return
async with aiofiles.open(self.filename, 'bw') as f:
s = json.dumps(self.filetime)
b = s.encode('utf-8')
await f.write(b)
await f.flush()
self.change_flg = False
async def load(self):
fn = self.filename
if not os.path.isfile(fn):
return
async with aiofiles.open(fn, 'br') as f:
b = await f.read()
s = b.decode('utf-8')
self.filetime = json.loads(s)
self.remove()
def file_useful(self, fpath):
try:
del self.filetime[fpath]
except Exception as e:
exception(f'Exception:{str(e)}')
pass
async def remove(self):
tim = time.time()
ft = {k:v for k,v in self.filetime.items()}
for k,v in ft.items():
if tim - v > self.timeout:
self.rmfile(k)
del self.filetime[k]
await self.save()
self.loop.call_later(self.time_period, self.remove)
def rmfile(self, name:str):
config = getConfig()
os.remove(config.fileroot + name)
class FileStorage:
def __init__(self):
config = getConfig()
self.root = os.path.abspath(config.filesroot or tempfile.gettempdir())
self.tfr = TmpFileRecord()
def realPath(self,path):
if path[0] == '/':
path = path[1:]
p = os.path.abspath(os.path.join(self.root,path))
return p
def _name2path(self,name, userid=None):
name = os.path.basename(name)
paths=[191,193,197,97]
v = int(time.time()*1000000)
# b = name.encode('utf8') if not isinstance(name,bytes) else name
# v = int.from_bytes(b,byteorder='big',signed=False)
root = self.root
if userid:
root += f'/{userid}'
path = os.path.abspath(os.path.join(root,
str(v % paths[0]),
str(v % paths[1]),
str(v % paths[2]),
str(v % paths[3]),
name))
return path
def remove(self, path):
try:
if path[0] == '/':
path = path[1:]
p = os.path.join(self.root, path)
os.remove(p)
except Exception as e:
exception(f'{path=}, {p=} remove error')
async def save(self,name,read_data, userid=None):
p = self._name2path(name, userid=userid)
fpath = p[len(self.root):]
info(f'{p=}, {fpath=},{self.root} ')
_mkdir(os.path.dirname(p))
if isinstance(read_data, str) or isinstance(read_data, bytes):
b = read_data
if isinstance(read_data, str):
b = read_data.encode('utf-8')
async with aiofiles.open(p, 'wb') as f:
await f.write(b)
await f.flush()
self.tfr.newtmpfile(fpath)
return fpath
async with aiofiles.open(p,'wb') as f:
siz = 0
while 1:
d = await read_data()
if not d:
break
siz += len(d);
await f.write(d)
await f.flush()
self.tfr.newtmpfile(fpath)
return fpath
def file_realpath(path):
fs = FileStorage()
return fs.realPath(path)

14
ahserver/filetest.py Normal file
View File

@ -0,0 +1,14 @@
import os
def current_fileno():
fn = './t.txt'
f = open(fn, 'w')
ret = f.fileno()
f.close()
os.remove(fn)
return ret
if __name__ == '__main__':
for i in range(1000):
print(current_fileno())

49
ahserver/functionProcessor.py Executable file
View File

@ -0,0 +1,49 @@
import inspect
from appPublic.dictObject import DictObject
from appPublic.registerfunction import RegisterFunction
from appPublic.log import info, debug, warning, error, exception, critical
from aiohttp import web
from aiohttp.web_response import Response, StreamResponse
from .baseProcessor import BaseProcessor
class FunctionProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return False
def __init__(self,path,resource, opts):
self.config_opts = opts
BaseProcessor.__init__(self,path,resource)
async def path_call(self, request, path):
path1 = request.path[len(self.config_opts['leading']):]
args = []
if len(path1) > 0:
if path1[0] == '/':
path1 = path1[1:]
args += path1.split('/')
rfname = self.config_opts['registerfunction']
ns = DictObject(**self.run_ns)
rf = RegisterFunction()
f = rf.get(rfname)
if f is None:
error(f'{rfname=} is not registered, {rf.registKW=}')
return None
self.run_ns['request'] = request
globals().update(self.run_ns)
if inspect.iscoroutinefunction(f):
return await f(request, self.run_ns, *args)
return f(request, self.run_ns, *args)
async def datahandle(self,request):
x = await self.path_call(request, self.path)
if isinstance(x,web.FileResponse):
self.retResponse = x
elif isinstance(x,Response):
self.retResponse = x
else:
self.content = x

213
ahserver/globalEnv.py Executable file
View File

@ -0,0 +1,213 @@
# -*- coding:utf8 -*-
import os
import builtins
import sys
import codecs
from urllib.parse import quote
import json
import asyncio
import random
import time
import datetime
from openpyxl import Workbook
from tempfile import mktemp
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from appPublic.Singleton import GlobalEnv
from appPublic.argsConvert import ArgsConvert
from appPublic.timeUtils import str2Date,str2Datetime,curDatetime,getCurrentTimeStamp,curDateString, curTimeString
from appPublic.dataencoder import quotedstr
from appPublic.folderUtils import folderInfo
from appPublic.uniqueID import setNode,getID
from appPublic.unicoding import unicoding,uDict,uObject
from appPublic.Singleton import SingletonDecorator
from appPublic.rc4 import password
from sqlor.dbpools import DBPools,runSQL,runSQLPaging
from sqlor.filter import DBFilter, default_filterjson
from sqlor.crud import CRUD
from .xlsxData import XLSXData
from .uriop import URIOp
from .error import Success, Error, NeedLogin, NoPermission
from .filetest import current_fileno
from .filestorage import FileStorage
from .serverenv import ServerEnv
def data2xlsx(rows,headers=None):
wb = Workbook()
ws = wb.active
i = 1
if headers is not None:
for j in range(len(headers)):
v = headers[j].title if headers[j].get('title',False) else headers[j].name
ws.cell(column=j+1,row=i,value=v)
i += 1
for r in rows:
for j in range(len(r)):
v = r[headers[j].name]
ws.cell(column=j+1,row=i,value=v)
i += 1
name = mktemp(suffix='.xlsx')
wb.save(filename = name)
wb.close()
return name
async def save_file(str_or_bytes, filename):
fs = FileStorage()
r = await fs.save(filename, str_or_bytes)
return r
def realpath(path):
fs = FileStorage()
return fs.realPath(path)
class FileOutZone(Exception):
def __init__(self,fp,*args,**kwargs):
super(FileOutZone,self).__init__(*args,**kwargs)
self.openfilename = fp
def __str__(self):
return self.openfilename + ': not allowed to open'
def get_config_value(kstr):
keys = kstr.split('.')
config = getConfig()
if config is None:
raise Exception('getConfig() error')
for k in keys:
config = config.get(k)
if not config:
return None
return config
def get_definition(k):
k = f'definitions.{k}'
return get_config_value(k)
def openfile(url,m):
fp = abspath(url)
if fp is None:
print(f'openfile({url},{m}), url does not match a file')
raise Exception('url can not match a file')
config = getConfig()
paths = [ os.path.abspath(p) for p in config.website.paths ]
fs = config.get('allow_folders',[])
fs = [ os.path.abspath(i) for i in fs + paths ]
r = False
for f in fs:
if fp.startswith(f):
r = True
break
if not r:
raise FileOutZone(fp)
return open(fp,m)
def isNone(a):
return a is None
def abspath(path):
config = getConfig()
paths = [ os.path.abspath(p) for p in config.website.paths ]
for root in paths:
p = root + path
if os.path.exists(root+path):
return p
return None
def appname():
config = getConfig()
try:
return config.license.app
except:
return "test app"
def configValue(ks):
config = getConfig()
try:
a = eval('config' + ks)
return a
except:
return None
def visualcoding():
return configValue('.website.visualcoding');
def file_download(request,path,name,coding='utf8'):
f = openfile(path,'rb')
b = f.read()
f.close()
fname = quote(name).encode(coding)
hah = b"attachment; filename=" + fname
# print('file head=',hah.decode(coding))
request.setHeader(b'Content-Disposition',hah)
request.setHeader(b'Expires',0)
request.setHeader(b'Cache-Control',b'must-revalidate, post-check=0, pre-check=0')
request.setHeader(b'Content-Transfer-Encoding',b'binary')
request.setHeader(b'Pragma',b'public')
request.setHeader(b'Content-Length',len(b))
request.write(b)
request.finish()
def initEnv():
pool = DBPools()
g = ServerEnv()
set_builtins()
g.configValue = configValue
g.visualcoding = visualcoding
g.uriop = URIOp
g.isNone = isNone
g.json = json
g.ArgsConvert = ArgsConvert
g.time = time
g.curDateString = curDateString
g.curTimeString = curTimeString
g.datetime = datetime
g.random = random
g.str2date = str2Date
g.str2datetime = str2Datetime
g.curDatetime = curDatetime
g.uObject = uObject
g.uuid = getID
g.runSQL = runSQL
g.runSQLPaging = runSQLPaging
g.runSQLIterator = pool.runSQL
g.runSQLResultFields = pool.runSQLResultFields
g.getTables = pool.getTables
g.getTableFields = pool.getTableFields
g.getTablePrimaryKey = pool.getTablePrimaryKey
g.getTableForignKeys = pool.getTableForignKeys
g.folderInfo = folderInfo
g.abspath = abspath
g.data2xlsx = data2xlsx
g.xlsxdata = XLSXData
g.openfile = openfile
g.CRUD = CRUD
g.DBPools = DBPools
g.DBFilter = DBFilter
g.default_filterjson = default_filterjson
g.Error = Error
g.Success = Success
g.NeedLogin = NeedLogin
g.NoPermission = NoPermission
g.password_encode = password
g.current_fileno = current_fileno
g.get_config_value = get_config_value
g.get_definition = get_definition
g.DictObject = DictObject
g.async_sleep = asyncio.sleep
g.quotedstr = quotedstr
g.save_file = save_file
g.realpath = realpath
def set_builtins():
all_builtins = [ i for i in dir(builtins) if not i.startswith('_')]
g = ServerEnv()
gg = globals()
for l in all_builtins:
exec(f'g["{l}"] = {l}',{'g':g})

81
ahserver/llmProcessor.py Executable file
View File

@ -0,0 +1,81 @@
import aiohttp
from aiohttp import web, BasicAuth
from aiohttp import client
from appPublic.dictObject import DictObject
from .llm_client import StreamLlmProxy, AsyncLlmProxy, SyncLlmProxy
from .baseProcessor import *
class LlmProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llm'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = StreamLlmProxy(self, d)
self.retResponse = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass
class LlmSProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llms'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = SyncLlmProxy(self, d)
self.content = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass
class LlmAProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llma'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = AsyncLlmProxy(self, d)
self.retResponse = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass

288
ahserver/llm_client.py Normal file
View File

@ -0,0 +1,288 @@
import re
import base64
import json
from traceback import print_exc
from aiohttp import web
from appPublic.dictObject import DictObject
from appPublic.httpclient import HttpClient, RESPONSE_TEXT, RESPONSE_JSON, RESPONSE_BIN,RESPONSE_FILE, RESPONSE_STREAM
from appPublic.argsConvert import ArgsConvert
def encode_imagefile(fn):
with open(fn, 'rb') as f:
return base64.b64encode(f.read()).decode('utf-8')
class StreamLlmProxy:
def __init__(self, processor, desc):
assert desc.name
self.name = desc.name
self.processor = processor
self.auth_api = desc.auth
self.desc = desc
self.api_name = desc.name
self.data = DictObject()
self.ac = ArgsConvert('${', '}')
def line_chunk_match(self, l):
if self.api.chunk_match:
match = re.search(self.api.chunk_match, l)
if match:
return match.group(1)
return l
async def write_chunk(self, ll):
def eq(a, b):
return a == b
def ne(a, b):
return a != b
opfuncs = {
'==':eq,
'!=':ne
}
if '[DONE]' in ll:
return
try:
# print('write_chunk(),l=', ll)
l = self.line_chunk_match(ll)
d = DictObject(** json.loads(l))
j = {}
for r in self.api.resp or []:
j[r.name] = d.get_data_by_keys(r.value);
if self.api.chunk_filter:
v = d.get_data_by_keys(self.api.chunk_filter.name)
v1 = self.api.chunk_filter.value
op = self.api.chunk_filter.op
f = opfuncs.get(op)
if f and f(v,v1):
j[self.api.chunk_filter.field] = ''
print('filtered j=', j)
jstr = json.dumps(j) + '\n'
bin = jstr.encode('utf-8')
await self.resp.write(bin)
await self.resp.drain()
except Exception as e:
print(f'Error:Write_chunk(),{l=} error:{e=}')
print_exc()
async def stream_handle(self, chunk):
print('chunk=', chunk)
chunk = chunk.decode('utf-8')
chunk = self.remain_str + chunk
lines = chunk.split('\n')
self.remain_str = lines[-1]
ls = lines[:-1]
for l in ls:
if l == '':
continue
await self.write_chunk(l)
async def get_apikey(self, apiname):
f = self.processor.run_ns.get_llm_user_apikey
if f:
# return a DictObject instance
return await f(apiname, self.user)
raise Exception('get_llm_user_apikey() function not found in ServerEnv')
async def do_auth(self, request):
d = self.desc.auth
self.data = self.get_data(self.name)
if self.data.authed:
return
self.data = await self.get_apikey(self.name)
if self.data is None:
raise Exception(f'user({self.user}) do not has a apikey for {self.name}')
method = d.get('method', 'POST')
headers = {}
for h in d.get('headers',{}):
headers[h.get('name')] = h.get('value')
mydata = {}
for p in d.data or []:
mydata[p.get('name')] = p.get('value')
myparams = {}
for p in d.params or []:
myparams[p.get('name')] = p.get('value')
url = d.get('url')
params = {}
_params = self.datalize(myparams, params)
_headers = self.datalize(headers, params)
_data = self.datalize(mydata, params)
hc = HttpClient()
resp_data = await hc.request(url, method, response_type=RESPONSE_JSON,
params=_params,
data=None if _data == {} else json.dumps(_data),
headers=_headers)
resp_data = DictObject(**resp_data)
for sd in d.set_data:
self.data[sd.name] = resp_data.get_data_by_keys(sd.field)
self.data.authed = True
self.set_data(self.name, self.data)
def data_key(self, apiname):
if self.user is None:
self.user = 'anonymous'
return apiname + '_a_' + self.user
def set_data(self, apiname, data):
request = self.processor.run_ns.request
app = request.app
app.set_data(self.data_key(apiname), data)
def get_data(self, apiname):
request = self.processor.run_ns.request
app = request.app
return app.get_data(self.data_key(apiname))
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
stream = params.stream
self.resp = web.StreamResponse()
await self.resp.prepare(request)
if stream is None:
stream = True
self.remain_str = ''
if not self.desc[mapi]:
raise Exception(f'{mapi} not defined')
d = self.desc[mapi]
self.api = d
self.chunk_match = d.chunk_match
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
method = d.get('method', 'POST')
headers = {}
for h in d.get('headers',{}):
headers[h.get('name')] = h.get('value')
mydata = {}
for p in d.get('data', {}):
mydata[p.get('name')] = p.get('value')
myparams = {}
for p in d.get('params', {}):
myparams[p.get('name')] = p.get('value')
url = d.get('url')
_params = self.datalize(myparams, params)
_headers = self.datalize(headers, params)
_data = self.datalize(mydata, params)
response_type = RESPONSE_STREAM
hc = HttpClient()
print(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data),
stream_func=self.stream_handle,
headers=_headers)
if self.remain_str != '':
await self.write_chunk(self.remain_str)
return self.resp
def datalize(self, dic, data={}):
mydata = self.data.copy()
mydata.update(data)
s1 = self.ac.convert(dic, mydata)
return s1
class SyncLlmProxy(StreamLlmProxy):
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
if not self.desc[mapi]:
return {
"status":"Error",
"message":f'{mapi} not defined'
}
d = self.desc[mapi]
self.api = d
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
method = d.get('method', 'POST')
headers = {}
for h in d.get('headers',{}):
headers[h.get('name')] = h.get('value')
mydata = {}
for p in d.get('data', {}):
mydata[p.get('name')] = p.get('value')
myparams = {}
for p in d.get('params', {}):
myparams[p.get('name')] = p.get('value')
url = d.get('url')
_params = self.datalize(myparams, params)
_headers = self.datalize(headers, params)
_data = self.datalize(mydata, params)
response_type = RESPONSE_JSON
hc = HttpClient()
print(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data),
headers=_headers)
print(f'{resp_data=}')
resp_data = DictObject(resp_data)
if resp_data is None:
return {
"status":"Error",
"message":f'{mapi} not defined'
}
return self.convert_resp(resp_data)
def convert_resp(self, resp):
j = {}
for r in self.api.resp or []:
j[r.name] = resp.get_data_by_keys(r.value);
return j
class AsyncLlmProxy(StreamLlmProxy):
pass
class AsyncLlmProxy:
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
stream = params.stream
self.resp = web.StreamResponse()
await self.resp.prepare(request)
if stream is None:
stream = True
self.remain_str = ''
if not self.desc[mapi]:
raise Exception(f'{mapi} not defined')
d = self.desc[mapi]
self.api = d
self.chunk_match = d.chunk_match
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
method = d.get('method', 'POST')
headers = {}
for h in d.get('headers',{}):
headers[h.get('name')] = h.get('value')
mydata = {}
for p in d.get('data', {}):
mydata[p.get('name')] = p.get('value')
myparams = {}
for p in d.get('params', {}):
myparams[p.get('name')] = p.get('value')
url = d.get('url')
_params = self.datalize(myparams, params)
_headers = self.datalize(headers, params)
_data = self.datalize(mydata, params)
response_type = RESPONSE_JSON
hc = HttpClient()
print(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data),
headers=_headers)
if self.remain_str != '':
await self.write_chunk(self.remain_str)
return self.resp

29
ahserver/loadplugins.py Normal file
View File

@ -0,0 +1,29 @@
import os
import sys
from appPublic.folderUtils import listFile
from appPublic.ExecFile import ExecFile
from ahserver.serverenv import ServerEnv
import appPublic
import sqlor
import ahserver
def load_plugins(p_dir):
ef = ExecFile()
pdir = os.path.join(p_dir, 'plugins')
if not os.path.isdir(pdir):
# print('load_plugins:%s not exists' % pdir)
return
sys.path.append(pdir)
ef.set('sys',sys)
ef.set('ServerEnv', ServerEnv)
for m in listFile(pdir, suffixs='.py'):
if m == '__init__.py':
continue
if not m.endswith('.py'):
continue
# print(f'{m=}')
module = os.path.basename(m[:-3])
# print('module=', module)
__import__(module, locals(), globals())

59
ahserver/myTE.py Executable file
View File

@ -0,0 +1,59 @@
import os
import codecs
from appPublic.Singleton import SingletonDecorator
from appPublic.jsonConfig import getConfig
from jinja2 import Template, Environment, BaseLoader, TemplateNotFound
from .serverenv import ServerEnv
from .url2file import Url2File, TmplUrl2File
class TmplLoader(BaseLoader, TmplUrl2File):
def __init__(self, paths, indexes, subffixes=['.tmpl'], inherit=False):
BaseLoader.__init__(self)
TmplUrl2File.__init__(self,paths,indexes=indexes,subffixes=subffixes, inherit=inherit)
def get_source(self,env: Environment,template: str):
config = getConfig()
coding = config.website.coding
fp = self.url2file(template)
# print(f'{template=} can not transfer to filename')
if not os.path.isfile(fp):
raise TemplateNotFound(template)
mtime = os.path.getmtime(fp)
with codecs.open(fp,'r',coding) as f:
source = f.read()
return source,fp,lambda:mtime == os.path.getmtime(fp)
def join_path(self,name, parent):
return self.relatedurl(parent,name)
def list_templates(self):
return []
class TemplateEngine(Environment):
def __init__(self,loader=None):
Environment.__init__(self,loader=loader, enable_async=True)
self.urlpaths = {}
self.loader = loader
def join_path(self,template: str, parent: str):
return self.loader.join_path(template, parent)
async def render(self,___name: str, **globals):
t = self.get_template(___name,globals=globals)
return await t.render_async(globals)
def setupTemplateEngine():
config = getConfig()
subffixes = [ i[0] for i in config.website.processors if i[1] == 'tmpl' ]
loader = TmplLoader(config.website.paths,
config.website.indexes,
subffixes,
inherit=True)
engine = TemplateEngine(loader)
g = ServerEnv()
g.tmpl_engine = engine

39
ahserver/p2p_middleware.py Executable file
View File

@ -0,0 +1,39 @@
from aiohttp import web
from appPublic.jsonConfig import getConfig
from p2psc.pubkey_handler import PubkeyHandler
from p2psc.p2psc import P2psc
class P2pLayer:
	def __init__(self):
		self.p2pcrypt = False
		config = getConfig()
		if config.website.p2pcrypt:
			self.p2pcrypt = True
		if not self.p2pcrypt:
			return
		self.handler = PubkeyHandler()
		self.p2p = P2psc(self.handler, self.handler.get_myid())
	@web.middleware
	async def p2p_middle(self, request, handler):
		if not self.p2pcrypt:
			return await handler(request)
		if request.headers.get('P2pHandShake', None):
			return await self.p2p_handshake(request)
		if request.headers.get('P2pdata', None):
			request = await self.p2p_decode_request(request)
			resp = await handler(request)
			return await self.p2p_encode_response(resp)
		return await handler(request)
	async def p2p_handshake(self, request):
		pass
	async def p2p_decode_request(self, request):
		pass
	async def p2p_encode_response(self, response):
		return response

446
ahserver/processorResource.py Executable file
View File

@ -0,0 +1,446 @@
import os
import re
import codecs
import aiofiles
from traceback import print_exc
# from showcallstack import showcallstack
import asyncio
import json
from yarl import URL
from aiohttp import client
from aiohttp_auth import auth
from appPublic.http_client import Http_Client
from functools import partial
from aiohttp_auth import auth
from aiohttp.web_urldispatcher import StaticResource, PathLike
from aiohttp.web_urldispatcher import Optional, _ExpectHandler
from aiohttp.web_urldispatcher import Path
from aiohttp.web_response import Response, StreamResponse
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
HTTPFound,
)
from aiohttp.web_fileresponse import FileResponse
from aiohttp.web_request import Request
from aiohttp.web_response import Response, StreamResponse
from aiohttp.web_routedef import AbstractRouteDef
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject, multiDict2Dict
from appPublic.i18n import getI18N
from appPublic.timecost import TimeCost
from appPublic.timeUtils import timestampstr
from appPublic.log import info, debug, warning, error, critical, exception
from .baseProcessor import getProcessor, BricksUIProcessor, TemplateProcessor
from .baseProcessor import PythonScriptProcessor, MarkdownProcessor
from .xlsxdsProcessor import XLSXDataSourceProcessor
from .llmProcessor import LlmProcessor, LlmSProcessor, LlmAProcessor
from .websocketProcessor import WebsocketProcessor, XtermProcessor
from .sqldsProcessor import SQLDataSourceProcessor
from .functionProcessor import FunctionProcessor
from .proxyProcessor import ProxyProcessor
from .serverenv import ServerEnv
from .url2file import Url2File
from .filestorage import FileStorage, file_realpath
from .restful import DBCrud
from .dbadmin import DBAdmin
from .filedownload import file_download, path_decode
from .utils import unicode_escape
from .filetest import current_fileno
from .auth_api import user_login, user_logout, get_session_user
def getHeaderLang(request):
al = request.headers.get('Accept-Language')
if al is None:
return 'en'
return al.split(',')[0]
def i18nDICT(request):
c = getConfig()
i18n = getI18N()
lang = getHeaderLang(request)
l = c.langMapping.get(lang,lang)
return json.dumps(i18n.getLangDict(l)).encode(c.website.coding)
class ProcessorResource(StaticResource,Url2File):
def __init__(self, prefix: str, directory: PathLike,
*, name: Optional[str]=None,
expect_handler: Optional[_ExpectHandler]=None,
chunk_size: int=256 * 1024,
show_index: bool=False, follow_symlinks: bool=False,
append_version: bool=False,
indexes:list=[],
processors:dict={}) -> None:
StaticResource.__init__(self,prefix, directory,
name=name,
expect_handler=expect_handler,
chunk_size=chunk_size,
show_index=show_index,
follow_symlinks=follow_symlinks,
append_version=append_version)
Url2File.__init__(self,directory,prefix,indexes,inherit=True)
gr = self._routes.get('GET')
self._routes.update({'POST':gr})
self._routes.update({'PUT':gr})
self._routes.update({'OPTIONS':gr})
self._routes.update({'DELETE':gr})
self._routes.update({'TRACE':gr})
self.y_processors = processors
self.y_prefix = prefix
self.y_directory = directory
self.y_indexes = indexes
self.y_env = DictObject()
def setProcessors(self, processors):
self.y_processors = processors
def setIndexes(self, indexes):
self.y_indexes = indexes
def abspath(self, request, path:str):
url = self.entireUrl(request, path)
path = self.url2path(url)
fname = self.url2file(path)
return fname
async def getPostData(self,request: Request) -> DictObject:
qd = {}
if request.query:
qd = multiDict2Dict(request.query)
reader = None
try:
reader = await request.multipart()
except:
# print('reader is None')
pass
if reader is None:
pd = await request.post()
pd = multiDict2Dict(pd)
if pd == {}:
if request.can_read_body:
x = await request.read()
try:
pd = json.loads(x)
except:
# print('body is not a json')
pass
qd.update(pd)
return DictObject(**qd)
ns = qd
while 1:
try:
field = await reader.next()
if not field:
break
value = ''
if hasattr(field,'filename') and field.filename is not None:
saver = FileStorage()
userid = await get_session_user(request)
value = await saver.save(field.filename,field.read_chunk, userid=userid)
else:
value = await field.read(decode=True)
value = value.decode('utf-8')
ov = ns.get(field.name)
if ov:
if type(ov) == type([]):
ov.append(value)
else:
ov = [ov,value]
else:
ov = value
ns.update({field.name:ov})
# print(f'getPostData():{ns=}')
except Exception as e:
print(e)
print_exc()
print('-----------except out ------------')
break;
return DictObject(ns)
def parse_request(self, request):
"""
get real schema, host, port, prepath
and save it to self._{attr}
"""
self._scheme = request.scheme
self._scheme = request.headers.get('X-Forwarded-Scheme',request.scheme)
k = request.host.split(':')
host = k[0]
port = 80
if len(k) == 2:
port = int(k[1])
elif self._scheme.lower() == 'https':
port = 443
self._host = request.headers.get('X-Forwarded-Host', host)
self._port = request.headers.get('X-Forwarded-Port', port)
self._prepath = request.headers.get('X-Forwarded-Prepath', '')
if self._prepath != '':
self._prepath = '/' + self._prepath
self._preurl = f'{self._scheme}://{self._host}:{self._port}{self._prepath}'
# print(f'{request.path=}, {self._preurl=}')
async def _handle(self,request:Request) -> StreamResponse:
clientkeys = {
"iPhone":"iphone",
"iPad":"ipad",
"Android":"androidpad",
"Windows Phone":"winphone",
"Windows NT[.]*Win64; x64":"pc",
}
def i18nDICT():
c = getConfig()
g = ServerEnv()
if not g.get('myi18n',False):
g.myi18n = getI18N()
lang = getHeaderLang(request)
l = c.langMapping.get(lang,lang)
return json.dumps(g.myi18n.getLangDict(l))
def getClientType(request):
agent = request.headers.get('user-agent')
if type(agent)!=type('') and type(agent)!=type(b''):
return 'pc'
for k in clientkeys.keys():
m = re.findall(k,agent)
if len(m)>0:
return clientkeys[k]
return 'pc'
def serveri18n(s):
lang = getHeaderLang(request)
c = getConfig()
g = ServerEnv()
if not g.get('myi18n',False):
g.myi18n = getI18N()
l = c.langMapping.get(lang,lang)
return g.myi18n(s,l)
async def getArgs() -> DictObject:
if request.method == 'POST':
return await self.getPostData(request)
ns = multiDict2Dict(request.query)
return DictObject(**ns)
async def redirect(url):
url = self.entireUrl(request, url)
raise HTTPFound(url)
async def remember_user(userid):
await user_login(request, userid)
async def remember_ticket(ticket):
await auth.remember_ticket(request, ticket)
async def get_ticket():
return await auth.get_ticket(request)
async def forget_user():
await user_logout(request)
async def get_user():
return await get_session_user(request)
self.parse_request(request)
self.y_env.i18n = serveri18n
self.y_env.file_realpath = file_realpath
self.y_env.redirect = redirect
self.y_env.info = info
self.y_env.error = error
self.y_env.debug = debug
self.y_env.warning = warning
self.y_env.critical = critical
self.y_env.exception = exception
self.y_env.remember_user = remember_user
self.y_env.forget_user = forget_user
self.y_env.get_user = get_user
self.y_env.i18nDict = i18nDICT
self.y_env.terminalType = getClientType(request)
self.y_env.entire_url = partial(self.entireUrl,request)
self.y_env.websocket_url = partial(self.websocketUrl,request)
self.y_env.abspath = self.abspath
self.y_env.request2ns = getArgs
self.y_env.aiohttp_client = client
self.y_env.resource = self
self.y_env.gethost = partial(self.gethost, request)
self.y_env.path_call = partial(self.path_call,request)
self.user = await auth.get_auth(request)
self.y_env.user = self.user
self.request_filename = self.url2file(str(request.path))
request['request_filename'] = self.request_filename
path = request.path
config = getConfig()
request['port'] = config.website.port
if config.website.dbadm and path.startswith(config.website.dbadm):
pp = path.split('/')[2:]
if len(pp)<3:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dbname = pp[0]
tablename = pp[1]
action = pp[2]
adm = DBAdmin(request,dbname,tablename,action)
return await adm.render()
if config.website.dbrest and path.startswith(config.website.dbrest):
pp = path.split('/')[2:]
if len(pp)<2:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dbname = pp[0]
tablename = pp[1]
id = None
if len(pp) > 2:
id = pp[2]
crud = DBCrud(request,dbname,tablename,id=id)
return await crud.dispatch()
if config.website.download and path.startswith(config.website.download):
pp = path.split('/')[2:]
if len(pp)<1:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dp = '/'.join(pp)
path = path_decode(dp)
return await file_download(request, path)
processor = self.url2processor(request, str(request.url), self.request_filename)
if processor:
ret = await processor.handle(request)
return ret
if self.request_filename and await self.isHtml(self.request_filename):
return await self.html_handle(request, self.request_filename)
if self.request_filename and os.path.isdir(self.request_filename):
config = getConfig()
if not config.website.allowListFolder:
error('%s:not found' % str(request.url))
raise HTTPNotFound
# print(f'{self.request_filename=}, {str(request.url)=} handle as a normal file')
return await super()._handle(request)
def gethost(self, request):
host = request.headers.get('X-Forwarded-Host')
if host:
return host
host = request.headers.get('Host')
if host:
return host
return '/'.join(str(request.url).split('/')[:3])
async def html_handle(self,request,filepath):
async with aiofiles.open(filepath,'r', encoding='utf-8') as f:
txt = await f.read()
utxt = txt.encode('utf-8')
headers = {
'Content-Type': 'text/html; utf-8',
'Accept-Ranges': 'bytes',
'Content-Length': str(len(utxt))
}
resp = Response(text=txt,headers=headers)
return resp
async def isHtml(self,fn):
try:
async with aiofiles.open(fn,'r',encoding='utf-8') as f:
b = await f.read()
while b[0] in ['\n',' ','\t']:
b = b[1:]
if b.lower().startswith('<html>'):
return True
if b.lower().startswith('<!doctype html>'):
return True
except Exception as e:
return False
def url2processor(self, request, url, fpath):
config = getConfig()
url1 = url
url = self.entireUrl(request, url)
host = '/'.join(url.split('/')[:3])
path = '/' + '/'.join(url.split('/')[3:])
if config.website.startswiths:
for a in config.website.startswiths:
if path.startswith(a.leading):
processor = FunctionProcessor(path,self,a)
return processor
if fpath is None:
print(f'fpath is None ..., {url=}, {url1=}')
return None
for word, handlername in self.y_processors:
if fpath.endswith(word):
Klass = getProcessor(handlername)
try:
processor = Klass(path,self)
# print(f'{f_cnt1=}, {f_cnt2=}, {f_cnt3=}, {f_cnt4=}, {f_cnt5=}')
return processor
except Exception as e:
print('Exception:',e, 'handlername=', handlername)
return None
return None
def websocketUrl(self, request, url):
url = self.entireUrl(request, url)
if url.startswith('https'):
return 'wss' + url[5:]
return 'ws' + url[4:]
def urlWebsocketify(self, url):
if url.endswith('.ws') or url.endswith('.wss'):
if url.startswith('https'):
return 'wss' + url[5:]
return 'ws' + url[4:]
return url
def entireUrl(self, request, url):
ret_url = ''
if url.startswith('http://') or \
url.startswith('https://') or \
url.startswith('ws://') or \
url.startswith('wss://'):
ret_url = url
elif url.startswith('/'):
u = f'{self._preurl}{url}'
# print(f'entireUrl(), {u=}, {url=}, {self._preurl=}')
ret_url = u
else:
path = request.path
p = self.relatedurl(path,url)
u = f'{self._preurl}{p}'
ret_url = u
return self.urlWebsocketify(ret_url)
def url2path(self, url):
if url.startswith(self._preurl):
return url[len(self._preurl):]
return url
async def path_call(self, request, path, params={}):
url = self.entireUrl(request, path)
# print(f'{path=}, after entireUrl(), {url=}')
path = self.url2path(url)
fpath = self.url2file(path)
processor = self.url2processor(request, path, fpath)
# print(f'path_call(), {path=}, {url=}, {fpath=}, {processor=}, {self._prepath}')
new_request = request.clone(rel_url=path)
return await processor.be_call(new_request, params=params)

ahserver/proxyProcessor.py Executable file
@ -0,0 +1,53 @@
import json
import aiohttp
from appPublic.log import info, debug, warning, error, critical, exception
from aiohttp import web, BasicAuth
from aiohttp import client
from .baseProcessor import *
class ProxyProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='proxy'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
debug('proxyProcessor: data=%s' % data)
return data
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
reqH = request.headers.copy()
auth = None
if d.get('user') and d.get('password'):
auth = BasicAuth(d['user'], d['password'])
async with client.request(
request.method,
d['url'],
auth=auth,
headers = reqH,
allow_redirects=False,
data=await request.read()) as res:
headers = res.headers.copy()
# body = await res.read()
self.retResponse = web.StreamResponse(
headers = headers,
status = res.status
# ,body=body
)
await self.retResponse.prepare(request)
async for chunk in res.content.iter_chunked(chunk_size):
await self.retResponse.write(chunk)
debug('proxy: datahandle() finish')
def setheaders(self):
pass

ahserver/restful.py Executable file
@ -0,0 +1,121 @@
import os
import re
import traceback
from aiohttp.web_response import Response
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
)
from aiohttp import web
from aiohttp.web_request import Request
from aiohttp.web import json_response
from sqlor.dbpools import DBPools
from appPublic.dictObject import multiDict2Dict
from appPublic.jsonConfig import getConfig
from .error import Error,Success
DEFAULT_METHODS = ('GET', 'POST', 'PUT', 'DELETE', 'HEAD', 'OPTIONS', 'TRACE')
class RestEndpoint:
def __init__(self):
self.methods = {}
for method_name in DEFAULT_METHODS:
method = getattr(self, method_name.lower(), None)
if method:
self.register_method(method_name, method)
def register_method(self, method_name, method):
self.methods[method_name.upper()] = method
async def dispatch(self):
method = self.methods.get(self.request.method.upper())
if not method:
raise HTTPMethodNotAllowed('', DEFAULT_METHODS)
return await method()
class DBCrud(RestEndpoint):
def __init__(self, request,dbname,tablename, id=None):
super().__init__()
self.dbname = dbname
self.tablename = tablename
self.request = request
self.db = DBPools()
self.id = id
async def options(self) -> Response:
try:
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.I(self.tablename)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='metaerror',msg='get metadata error'))
async def get(self) -> Response:
"""
query data
"""
try:
ns = multiDict2Dict(self.request.query)
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.R(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='search error',msg='search error'))
async def post(self):
"""
insert data
"""
try:
ns = multiDict2Dict(await self.request.post())
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.C(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='add error',msg='add error'))
async def put(self):
"""
update data
"""
try:
ns = multiDict2Dict(await self.request.post())
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.U(self.tablename, ns)
return json_response(Success(' '))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='update error',msg='update error'))
async def delete(self):
"""
delete data
"""
try:
ns = multiDict2Dict(self.request.query)
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.D(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='delete error',msg='delete error'))
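
DBCrud is reached through the config.website.dbrest prefix handled in ProcessorResource._handle(), with the database name and table name taken from the next two path segments. A hedged client-side sketch, where the /rest prefix, database and table names are made up:
```
import asyncio
import aiohttp

async def main():
    # assumes a local server with config.website.dbrest set to "/rest"
    base = 'http://localhost:8080/rest/metadb/users'
    async with aiohttp.ClientSession() as sess:
        # GET -> query rows, POST -> insert, OPTIONS -> table metadata
        async with sess.get(base, params={'name': 'joe'}) as resp:
            print(await resp.json())
        async with sess.post(base, data={'name': 'joe', 'age': '30'}) as resp:
            print(await resp.json())

asyncio.run(main())
```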

ahserver/serverenv.py Executable file
@ -0,0 +1,25 @@
import re
from appPublic.Singleton import SingletonDecorator
from appPublic.dictObject import DictObject
@SingletonDecorator
class ServerEnv(DictObject):
pass
clientkeys = {
"iPhone":"iphone",
"iPad":"ipad",
"Android":"androidpad",
"Windows Phone":"winphone",
"Windows NT[.]*Win64; x64":"pc",
}
def getClientType(request):
agent = request.headers.get('user-agent')
if not isinstance(agent, (str, bytes)):
return 'pc'
for k in clientkeys.keys():
m = re.findall(k,agent)
if len(m)>0:
return clientkeys[k]
return 'pc'
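
ServerEnv is the process-wide bag of names that processors inject into script namespaces; because it is wrapped by SingletonDecorator, every module that constructs it gets the same DictObject. A small sketch:
```
from ahserver.serverenv import ServerEnv

g1 = ServerEnv()
g1.greeting = 'hello'

g2 = ServerEnv()
# both names refer to the same singleton, so values set anywhere are visible everywhere
assert g1 is g2
print(g2.greeting)
```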

ahserver/sqldsProcessor.py Executable file
@ -0,0 +1,75 @@
import codecs
from .dsProcessor import DataSourceProcessor
from appPublic.jsonConfig import getConfig
from sqlor.dbpools import DBPools
import json
"""
sqlds file format:
{
"sqldesc":{
"sql_string":"select * from dbo.stock_daily_hist where stock_num=${stock_num}$ order by trade_date desc",
"db":"mydb",
"sortfield":"stock_date"
},
"arguments":[
{
"name":"stock_num",
"type":"str",
"iotype":"text",
"default":"600804"
}
],
"datadesc":[
{
}
]
}
"""
class SQLDataSourceProcessor(DataSourceProcessor):
@classmethod
def isMe(self,name):
return name=='sqlds'
def getArgumentsDesc(self,dict_data,ns,request):
desc = dict_data.get('arguments',None)
return desc
async def getDataDesc(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQLResultFields
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
# print('sql(),sqldesc=',sqldesc)
return sqldesc
rec = dict_data.get('datadesc',None)
if rec is None:
sqldesc = dict_data.get('sqldesc')
ns = dict_data.get('arguments',{})
rec = [ r for r in await sql(sqldesc['db'],ns) if r['name']!='_row_id' ]
dict_data['datadesc'] = rec
f = codecs.open(self.src_file,'w',self.config.website.coding)
b = json.dumps(dict_data,indent=4)
f.write(b)
f.close()
return rec
async def getData(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQL
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
return sqldesc
db = dict_data['sqldesc']['db']
ret = [ i for i in await sql(db,ns) ]
return ret
async def getPagingData(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQLPaging
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
return sqldesc
db = dict_data['sqldesc']['db']
ret = await sql(db,ns)
return ret
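
A .sqlds file is just the JSON shown in the docstring above; requesting it with ?action=getdata, ?action=pagingdata, ?action=arguments or ?action=resultFields dispatches to the corresponding method through DataSourceProcessor.path_call(). A sketch that writes an illustrative file (db, table and field names are made up):
```
import json

sqlds = {
    "sqldesc": {
        "sql_string": "select * from stock_daily_hist where stock_num=${stock_num}$",
        "db": "mydb",
        "sortfield": "trade_date"
    },
    "arguments": [
        {"name": "stock_num", "type": "str", "iotype": "text", "default": "600804"}
    ]
}
with open('stock_hist.sqlds', 'w', encoding='utf-8') as f:
    json.dump(sqlds, f, indent=4)
```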

ahserver/uriop.py Executable file
@ -0,0 +1,83 @@
#
import os
import codecs
from appPublic.jsonConfig import getConfig
from appPublic.folderUtils import folderInfo
class URIopException(Exception):
def __init__(self,errtype,errmsg):
self.errtype = errtype
self.errmsg = errmsg
super(URIopException,self).__init__('errtype=%s,errmsg=%s' % (errtype,errmsg))
def __str__(self):
return 'errtype=%s,errmsg=%s' % (self.errtype,self.errmsg)
class URIOp(object):
def __init__(self):
self.conf = getConfig()
self.realPath = os.path.abspath(self.conf.website.root)
def abspath(self,uri=None):
p = self.conf.website.root
if uri is not None and len(uri)>0:
x = uri
if x[0] == '/':
x = x[1:]
p = os.path.join(p,*x.split('/'))
d = os.path.abspath(p)
if len(d) < len(self.realPath):
raise URIopException('url scope error',uri)
if d[:len(self.realPath)] != self.realPath:
raise URIopException('url scope error',uri)
return d
def fileList(self,uri=''):
r = [ i for i in folderInfo(self.realPath,uri) ]
for i in r:
if i['type']=='dir':
i['state'] = 'closed'
i['id'] = '_#_'.join(i['id'].split('/'))
ret={
'total':len(r),
'rows':r
}
return ret
def mkdir(self,at_uri,name):
p = self.abspath(at_uri)
p = os.path.join(p,name)
os.mkdir(p)
def rename(self,uri,newname):
p = self.abspath(uri)
dir = os.path.dirname(p)
np = os.path.join(dir,newname)
os.rename(p,np)
def delete(self,uri):
p = self.abspath(uri)
os.remove(p)
def save(self,uri,data):
p = self.abspath(uri)
f = codecs.open(p,"w",self.conf.website.coding)
f.write(data)
f.close()
def read(self,uri):
p = self.abspath(uri)
f = codecs.open(p,"r",self.conf.website.coding)
b = f.read()
f.close()
return b
def write(self,uri,data):
p = self.abspath(uri)
f = codecs.open(p,"w",self.conf.website.coding)
f.write(data)
f.close()
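
URIOp confines every operation to config.website.root and raises URIopException for paths that escape it. A usage sketch, assuming getConfig() is initialized and website.root points at the site root:
```
from ahserver.uriop import URIOp

op = URIOp()
op.mkdir('/', 'docs')                   # create <root>/docs
op.save('/docs/readme.txt', 'hello')    # write a text file inside the root
print(op.read('/docs/readme.txt'))
print(op.fileList('/docs'))             # {'total': ..., 'rows': [...]}
op.delete('/docs/readme.txt')
```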

ahserver/url2file.py Executable file
@ -0,0 +1,114 @@
import os
from appPublic.folderUtils import listFile
class Url2File:
def __init__(self,path:str,prefix: str,
indexes: list, inherit: bool=False):
self.rootpath = path
self.starts = prefix
self.indexes = indexes
self.inherit = inherit
def realurl(self,url:str) -> str :
items = url.split('/')
items = [ i for i in items if i != '.' ]
while '..' in items:
for i,v in enumerate(items):
if v=='..' and i > 0:
del items[i]
del items[i-1]
break
return '/'.join(items)
def url2ospath(self, url: str) -> str:
url = url.split('?')[0]
if len(url) > 0 and url[-1] == '/':
url = url[:-1]
paths = url.split('/')
if url.startswith('http://') or \
url.startswith('https://') or \
url.startswith('ws://') or \
url.startswith('wss://'):
paths = paths[3:]
f = os.path.join(self.rootpath,*paths)
real_path = os.path.abspath(f)
# print(f'{real_path=}, {url=}, {f=}')
return real_path
def url2file(self, url: str) -> str:
ourl = url
url = url.split('?')[0]
real_path = self.url2ospath(url)
if os.path.isdir(real_path):
for idx in self.indexes:
p = os.path.join(real_path,idx)
if os.path.isfile(p):
# print(f'{url=}, {real_path=}, {idx=}, {p=}')
return p
if os.path.isfile(real_path):
return real_path
if not os.path.isdir(os.path.dirname(real_path)):
# print(f'url2file() return None, {real_path=}, {url=},{ourl=}, {self.rootpath=}')
return None
if not self.inherit:
# print(f'url2file() return None, self.inherit is false, {url:}, {self.rootpath=}')
return None
items = url.split('/')
if len(items) > 2:
del items[-2]
oldurl = url
url = '/'.join(items)
# print(f'{oldurl=}, {url=}')
return self.url2file(url)
# print(f'url2file() return None finally, {items:}, {url=}, {ourl=}, {self.rootpath=}')
return None
def relatedurl(self,url: str, name: str) -> str:
if len(url) > 0 and url[-1] == '/':
url = url[:-1]
fp = self.url2ospath(url)
if os.path.isfile(fp):
items = url.split('/')
del items[-1]
url = '/'.join(items)
url = url + '/' + name
return self.realurl(url)
def relatedurl2file(self,url: str, name: str):
url = self.relatedurl(url,name)
return self.url2file(url)
class TmplUrl2File:
def __init__(self,paths,indexes, subffixes=['.tmpl','.ui' ],inherit=False):
self.paths = paths
self.u2fs = [ Url2File(p,prefix,indexes,inherit=inherit) \
for p,prefix in paths ]
self.subffixes = subffixes
def url2file(self,url):
for u2f in self.u2fs:
fp = u2f.url2file(url)
if fp:
return fp
return None
def relatedurl(self,url: str, name: str) -> str:
for u2f in self.u2fs:
fp = u2f.relatedurl(url, name)
if fp:
return fp
return None
def list_tmpl(self):
ret = []
for rp,_ in self.paths:
p = os.path.abspath(rp)
ret.extend(listFile(p,suffixs=self.subffixes,rescursive=True))
return sorted(ret)
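
Url2File is the path-resolution core used by ProcessorResource and the template loader: it maps a URL onto a file below a root folder, falls back to the configured index files for folders, and, with inherit=True, retries the parent folder when a file is missing. A sketch with a hypothetical ./wwwroot tree:
```
from ahserver.url2file import Url2File

u2f = Url2File('./wwwroot', '', ['index.html'], inherit=True)
print(u2f.url2file('/js/app.js'))   # absolute path of ./wwwroot/js/app.js when it exists
print(u2f.url2file('/'))            # a folder falls back to its index.html
print(u2f.relatedurl('/a/b/page.html', '../c.css'))   # '..' segments are resolved
```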

ahserver/utils.py Executable file
@ -0,0 +1,4 @@
def unicode_escape(s):
x = [ch if ord(ch) < 256 else ch.encode('unicode_escape').decode('utf-8') for ch in s]
return ''.join(x)

ahserver/version.py Executable file
@ -0,0 +1 @@
__version__ = '0.3.4'

ahserver/websocketProcessor.py Executable file
@ -0,0 +1,195 @@
import asyncio
import aiohttp
import aiofiles
import json
import codecs
from aiohttp import web
import aiohttp_cors
from traceback import print_exc
from appPublic.sshx import SSHNode
from appPublic.log import info, debug, warning, error, exception, critical
from .baseProcessor import BaseProcessor, PythonScriptProcessor
class XtermProcessor(PythonScriptProcessor):
@classmethod
def isMe(self,name):
return name=='xterm'
async def ws_2_process(self, ws):
async for msg in ws:
if msg.type == aiohttp.WSMsgType.TEXT:
self.p_obj.stdin.write(msg.data)
elif msg.type == aiohttp.WSMsgType.ERROR:
# print('ws connection closed with exception %s' % ws.exception())
return
async def process_2_ws(self, ws):
while self.running:
x = await self.p_obj.stdout.read(1024)
await self.ws_sendstr(ws, x)
async def datahandle(self,request):
await self.path_call(request)
async def path_call(self, request, params={}):
#
# xterm file is a python script as dspy file
# it must return a DictObject with sshnode information
# parameters: nodeid
#
login_info = await super().path_call(request, params=params)
ws = web.WebSocketResponse()
await ws.prepare(request)
await self.create_process(login_info)
await self.ws_sendstr(ws, 'Welcome to sshclient')
r1 = self.ws_2_process(ws)
r2 = self.process_2_ws(ws)
await asyncio.gather(r1,r2)
self.retResponse = ws
return ws
async def get_login_info(self):
async with aiofiles.open(self.real_path, 'r', encoding='utf-8') as f:
txt = await f.read()
self.login_info = json.loads(txt)
# print(f'{self.login_info=}')
async def create_process(self, login_info):
# id = lenv['params_kw'].get('termid')
host = login_info['host']
port = login_info.get('port', 22)
username = login_info.get('username', 'root')
password = login_info.get('password',None)
self.sshnode = SSHNode(host, username=username,
password=password,
port=port)
await self.sshnode.connect()
self.p_obj = await self.sshnode._process('bash',
term_type='vt100',
term_size=(80, 24),
encoding='utf-8')
self.running = True
async def ws_sendstr(self, ws:web.WebSocketResponse, s:str):
data = {
"type":1,
"data":s
}
await ws.send_str(json.dumps(data))
def close_process(self):
self.sshnode.close()
self.p_obj.close()
async def ws_send(ws:web.WebSocketResponse, data):
info(f'data={data} {ws=}')
d = {
"type":1,
"data":data
}
d = json.dumps(d)
try:
return await ws.send_str(d)
except Exception as e:
exception(f'ws.send_str() error: {e=}')
print_exc()
return False
class WsPool:
def __init__(self, ws, ws_path, app):
self.app = app
self.id = None
self.ws = ws
self.ws_path = ws_path
def get_data(self):
return self.app.get_data(self.ws_path)
def set_data(self, data):
self.app.set_data(self.ws_path, data)
def is_online(self, userid):
data = self.get_data()
ws = data.get(userid)
if ws is None:
return False
return True
def add_me(self, iddata):
data = self.get_data()
if data is None:
data = {}
iddata.ws = self.ws
self.id = iddata.id
data.update({self.id:iddata})
info(f'add_me() {data=}')
self.set_data(data)
def delete_id(self, id):
data = self.get_data()
if not data.get(id):
return
data = {k:v for k,v in data.items() if k != id }
self.set_data(data)
def delete_me(self):
self.delete_id(self.id)
async def sendto(self, data, id=None):
if id is None:
return await ws_send(self.ws, data)
d = self.get_data()
iddata = d.get(id)
info(f'{id=} {d=}, {iddata=}')
ws = iddata.ws
try:
return await ws_send(ws, data)
except:
self.delete_id(id)
class WebsocketProcessor(PythonScriptProcessor):
@classmethod
def isMe(self,name):
return name=='ws'
async def path_call(self, request,params={}):
await self.set_run_env(request)
lenv = self.run_ns.copy()
lenv.update(params)
params_kw = lenv.params_kw
userid = lenv.params_kw.userid or await lenv.get_user()
del lenv['request']
txt = await self.loadScript(self.real_path)
ws = web.WebSocketResponse()
try:
await ws.prepare(request)
except Exception as e:
print('--------except:', e, type(e))
print_exc()
raise e
ws_pool = WsPool(ws, request.path, request.app)
async for msg in ws:
if msg.type == aiohttp.WSMsgType.TEXT:
if msg.data == 'exit':
break
lenv['ws_data'] = msg.data
lenv['ws_pool'] = ws_pool
exec(txt,lenv,lenv)
func = lenv['myfunc']
resp = await func(request,**lenv)
elif msg.type == aiohttp.WSMsgType.ERROR:
info('ws connection closed with exception %s' % ws.exception())
break
else:
info('datatype error', msg.type)
info(f'========== ws connection end ===========')
ws_pool.delete_me()
self.retResponse = ws
await ws.close()
return ws
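
WebsocketProcessor wraps the target script with loadScript() exactly like a dspy file and re-executes it for every incoming TEXT frame, injecting ws_data and ws_pool into the namespace. A sketch of what an echo-style .ws script might look like (only ws_data, ws_pool and the logging helpers come from the server environment; everything else is illustrative):
```
# echo.ws -- runs once per websocket text frame as the body of myfunc()
info(f'got frame: {ws_data}')
# send the frame back to the connection it came from
await ws_pool.sendto(ws_data)
return 'ok'
```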

ahserver/xlsxData.py Executable file
@ -0,0 +1,128 @@
from openpyxl import load_workbook
import json
"""
xlsxds file format:
{
"xlsxfile":"./data.xlsx",
"data_from":7,
"data_sheet":"Sheet1",
"label_at",1,
"name_at":null,
"datatype_at":2,
"ioattrs_at":3,
"listhide_at":4,
"inputhide_at":5,
"frozen_at":6
}
"""
class XLSXData:
def __init__(self,path,desc):
self.desc = desc
self.xlsxfile = path
self.workbook = load_workbook(self.xlsxfile)
self.ws = self.workbook[self.desc['data_sheet']]
def getBaseFieldsInfo(self):
ws = self.workbook[self.desc['data_sheet']]
ret = []
for y in range(1,ws.max_column+1):
r = {
'name':self._fieldName(ws,y),
'label':self._fieldLabel(ws,y),
'type':self._fieldType(ws,y),
'listhide':self._isListHide(ws,y),
'inputhide':self._isInputHide(ws,y),
'frozen':self._isFrozen(ws,y)
}
r.update(self._fieldIOattrs(ws,y))
ret.append(r)
return ret
def _fieldName(self,ws,i):
x = self.desc.get('name_at')
if x is not None:
return ws.cell(x,i).value
return 'f' + str(i)
def _fieldLabel(self,ws,i):
x = self.desc.get('label_at',1)
if x is not None:
return ws.cell(x,i).value
return 'f' + str(i)
def _fieldType(self,ws,i):
x = self.desc.get('datatype_at')
if x is not None:
return ws.cell(x,i).value
return 'str'
def _fieldIOattrs(self,ws,i):
x = self.desc.get('ioattrs_at')
if x is not None:
t = ws.cell(x,i).value
if t is not None:
try:
return json.loads(t)
except Exception as e:
print('xlsxData.py:field=',i,'t=',t,'error')
return {}
def _isFrozen(self,ws,i):
x = self.desc.get('frozen_at')
if x is not None:
t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def _isListHide(self,ws,i):
x = self.desc.get('listhide_at')
if x is not None:
t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def _isInputHide(self,ws,i):
x = self.desc.get('inputhide_at')
if x is not None:
t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def getPeriodData(self,min_r,max_r):
ws = self.ws
rows = []
assert(min_r >= self.desc.get('data_from',2))
if max_r > ws.max_row:
max_r = ws.max_row + 1
if min_r <= max_r:
x = min_r
while x < max_r:
d = {}
for y in range(1,ws.max_column+1):
name = self._fieldName(ws,y)
d.update({name:ws.cell(column=y,row=x).value})
rows.append(d)
x = x + 1
return rows
def getArgumentsDesc(self,ns,request):
return None
def getData(self,ns):
ws = self.ws
min_r = self.desc.get('data_from',2)
return self.getPeriodData(min_r,ws.max_row + 1)
def getPagingData(self,ns):
rows = int(ns.get('rows',50))
page = int(ns.get('page',1))
d1 = self.desc.get('data_from',2)
min_r = (page - 1) * rows + d1
max_r = page * rows + d1 + 1
rows = self.getPeriodData(min_r,max_r)
ret = {
'total':self.ws.max_row - d1,
'rows':rows
}
return ret
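
A usage sketch for XLSXData, assuming a data.xlsx whose first six rows carry the metadata described in the docstring and whose data starts at row 7 (the file and layout are illustrative):
```
from ahserver.xlsxData import XLSXData

desc = {
    "data_sheet": "Sheet1",
    "data_from": 7,
    "label_at": 1,
    "datatype_at": 2,
    "ioattrs_at": 3,
    "listhide_at": 4,
    "inputhide_at": 5,
    "frozen_at": 6
}
xd = XLSXData('data.xlsx', desc)
print(xd.getBaseFieldsInfo())                      # per-column metadata
print(xd.getPagingData({'page': 1, 'rows': 10}))   # {'total': ..., 'rows': [...]}
```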

ahserver/xlsxdsProcessor.py Executable file
@ -0,0 +1,51 @@
import codecs
from openpyxl import load_workbook
from appPublic.jsonConfig import getConfig
from .dsProcessor import DataSourceProcessor
from .xlsxData import XLSXData
"""
xlsxds file format:
{
"xlsxfile":"./data.xlsx",
"data_from":7,
"data_sheet":"Sheet1",
"label_at",1,
"name_at":null,
"datatype_at":2,
"ioattrs":3,
"listhide_at":4,
"inputhide_at":5,
"frozen_at":6
}
"""
class XLSXDataSourceProcessor(DataSourceProcessor):
@classmethod
def isMe(self,name):
return name=='xlsxds'
def getArgumentsDesc(self,dict_data,ns,request):
return None
async def getDataDesc(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
ret = self.xlsxdata.getBaseFieldsInfo()
return ret
async def getData(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
ret = self.xlsxdata.getData(ns)
return ret
async def getPagingData(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
ret = self.xlsxdata.getPagingData(ns)
return ret


@ -0,0 +1 @@
ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAzt2GN0Y6T3JalHmQQWqE+Ag+uPbJbWEAMLr5cSEL+tZ0pNFA+LzXQvmVWPbP2ohr2ry9VF8Eng4Xt44Tq0XLZQohBnQc5xUYd65Gc/YR9CIweUrkjKykW9aZtkitk6+7bYLzaT3qp6Szj6ZJ8lAT2Q4gOg+gIgOWmn9oaVo3ZzWGPuKUEHG9rAA7VeKRaZoifbL3zTgPb4l7oG8Mr0S84cF58RPPaD/4SDvRV+l1jBZXq5FlXGVvW07q8g2O+iWpa8Y8HJwYez9GT7FxqvPG8a3D12Mh6xGzIWJcOhM/T/ZvRGQSg4CrzbUHoygt/HkgmNEAna/XJW+7BFoPKyCLWw==

build.sh Executable file
@ -0,0 +1,4 @@
rm dist/*.whl
python setup.py install
python setup.py bdist_wheel
python -m twine upload --repository-url https://upload.pypi.org/legacy/ dist/*.whl


@ -0,0 +1,163 @@
import time
import uuid
from aiohttp_auth import auth
from aiohttp_auth.auth.ticket_auth import TktAuthentication
from aiohttp_session.redis_storage import RedisStorage
from os import urandom
from aiohttp import web
import aiohttp_session
import aioredis
import base64
import binascii
from aiohttp_session import get_session, session_middleware, Session
from aiohttp_session.cookie_storage import EncryptedCookieStorage
from aiohttp_session.redis_storage import RedisStorage
from appPublic.jsonConfig import getConfig
from appPublic.rsawrap import RSA
from appPublic.log import info, debug, warning, error, critical, exception
def get_client_ip(obj, request):
ip = request.headers.get('X-Forwarded-For')
if not ip:
ip = request.remote
request['client_ip'] = ip
return ip
async def get_session_user(request):
userid = await auth.get_auth(request)
return userid
async def user_login(request, userid):
await auth.remember(request, userid)
async def user_logout(request):
await auth.forget(request)
class MyRedisStorage(RedisStorage):
def key_gen(self, request):
key = request.headers.get('client_uuid')
if not key:
key = uuid.uuid4().hex
return key
if isinstance(key, str):
key = key.encode('utf-8')
key = binascii.hexlify(key)
key = key.decode('utf-8')
return key
async def save_session(self, request: web.Request,
response: web.StreamResponse,
session: Session) -> None:
key = session.identity
if key is None:
key = self.key_gen(request)
self.save_cookie(response, key, max_age=session.max_age)
else:
if session.empty:
self.save_cookie(response, "", max_age=session.max_age)
else:
key = str(key)
self.save_cookie(response, key, max_age=session.max_age)
data_str = self._encoder(self._get_session_data(session))
await self._redis.set(
self.cookie_name + "_" + key,
data_str,
ex=session.max_age,
)
class AuthAPI:
def __init__(self):
self.conf = getConfig()
async def checkUserPermission(self, user, path):
# print('************* checkUserPermission() use default one ****************')
return True
def getPrivateKey(self):
if not hasattr(self,'rsaEngine'):
self.rsaEngine = RSA()
fname = self.conf.website.rsakey.privatekey
self.privatekey = self.rsaEngine.read_privatekey(fname)
return self.privatekey
def rsaDecode(self,cdata):
self.getPrivateKey()
return self.rsaEngine.decode(self.privatekey,cdata)
async def setupAuth(self,app):
# setup session middleware in aiohttp fashion
b = str(self.conf.website.port).encode('utf-8')
cnt = 32 - len(b)
secret = b + b'iqwertyuiopasdfghjklzxcvbnm12345'[:cnt]
storage = EncryptedCookieStorage(secret)
if self.conf.website.session_redis:
url = self.conf.website.session_redis.url
# redis = await aioredis.from_url("redis://127.0.0.1:6379")
redis = await aioredis.from_url(url)
storage = MyRedisStorage(redis)
aiohttp_session.setup(app, storage)
# Create an auth ticket mechanism that expires after session_max_time
# seconds, with a secret derived from the website port. Also includes the
# optional inclusion of the users IP address in the hash
session_max_time = 120
session_reissue_time = 30
if self.conf.website.session_max_time:
session_max_time = self.conf.website.session_max_time
if self.conf.website.session_reissue_time:
session_reissue_time = self.conf.website.session_reissue_time
def _new_ticket(self, request, user_id):
client_uuid = request.headers.get('client_uuid')
ip = self._get_ip(request)
valid_until = int(time.time()) + self._max_age
# print(f'hack: my _new_ticket() called ... remote {ip=}, {client_uuid=}')
return self._ticket.new(user_id,
valid_until=valid_until,
client_ip=ip,
user_data=client_uuid)
TktAuthentication._get_ip = get_client_ip
TktAuthentication._new_ticket = _new_ticket
policy = auth.SessionTktAuthentication(secret,
session_max_time,
reissue_time=session_reissue_time,
include_ip=True)
# setup aiohttp_auth.auth middleware in aiohttp fashion
# print('policy = ', policy)
auth.setup(app, policy)
app.middlewares.append(self.checkAuth)
@web.middleware
async def checkAuth(self,request,handler):
info(f'checkAuth() called ... {request.path=}')
t1 = time.time()
path = request.path
user = await auth.get_auth(request)
is_ok = await self.checkUserPermission(user, path)
t2 = time.time()
ip = get_client_ip(None, request)
if is_ok:
try:
ret = await handler(request)
t3 = time.time()
info(f'timecost=client({ip}) {user} access {path} cost {t3-t1}, ({t2-t1})')
return ret
except Exception as e:
t3 = time.time()
info(f'Exception=client({ip}) {user} access {path} cost {t3-t1}, ({t2-t1}), except={e}')
raise e
if user is None:
info(f'timecost=client({ip}) {user} access need login to access {path} ({t2-t1})')
raise web.HTTPUnauthorized
info(f'timecost=client({ip}) {user} access {path} forbidden ({t2-t1})')
raise web.HTTPForbidden()
async def needAuth(self,path):
return False


@ -0,0 +1,243 @@
import os
import re
import json
import codecs
import aiofiles
from aiohttp.web_request import Request
from aiohttp.web_response import Response, StreamResponse
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from appPublic.folderUtils import listFile
from appPublic.log import info, debug, warning, error, critical, exception
from .utils import unicode_escape
from .serverenv import ServerEnv
from .filetest import current_fileno
class ObjectCache:
def __init__(self):
self.cache = {}
def store(self,path,obj):
o = self.cache.get(path,None)
if o is not None:
try:
del o.cached_obj
except:
pass
o = DictObject()
o.cached_obj = obj
o.mtime = os.path.getmtime(path)
self.cache[path] = o
def get(self,path):
o = self.cache.get(path)
if o:
if os.path.getmtime(path) > o.mtime:
return None
return o.cached_obj
return None
class BaseProcessor:
@classmethod
def isMe(self,name):
return name=='base'
def __init__(self,path,resource):
self.env_set = False
self.path = path
self.resource = resource
self.retResponse = None
# self.last_modified = os.path.getmtime(path)
# self.content_length = os.path.getsize(path)
self.headers = {
'Content-Type': 'text/html; utf-8',
'Accept-Ranges': 'bytes'
}
self.content = ''
async def be_call(self, request, params={}):
return await self.path_call(request, params=params)
async def set_run_env(self, request, params={}):
if self.env_set:
return
self.real_path = self.resource.url2file(request.path)
g = ServerEnv()
self.run_ns = DictObject()
self.run_ns.update(g)
self.run_ns.update(self.resource.y_env)
self.run_ns['request'] = request
self.run_ns['app'] = request.app
kw = await self.run_ns['request2ns']()
kw.update(params)
self.run_ns['params_kw'] = kw
self.run_ns.update(kw)
self.run_ns['ref_real_path'] = self.real_path
self.run_ns['processor'] = self
self.env_set = True
async def execute(self,request):
await self.set_run_env(request)
await self.datahandle(request)
return self.content
def set_response_headers(self, response):
response.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
# response.headers['Access-Control-Allow-Credentials'] = 'true'
# response.headers['Access-Control-Allow-Origin'] = '47.93.12.75'
async def handle(self,request):
await self.execute(request)
jsonflg = False
if self.retResponse is not None:
self.set_response_headers(self.retResponse)
return self.retResponse
elif isinstance(self.content, Response):
return self.content
elif isinstance(self.content, StreamResponse):
return self.content
elif isinstance(self.content, (DictObject, dict, list, tuple)):
self.content = json.dumps(self.content, indent=4)
jsonflg = True
elif isinstance(self.content, bytes):
self.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
self.headers['Content-Length'] = str(len(self.content))
resp = Response(body=self.content,headers=self.headers)
self.set_response_headers(resp)
return resp
else:
try:
json.loads(self.content)
jsonflg = True
except:
pass
if jsonflg:
self.headers['Content-Type'] = "application/json; utf-8"
self.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
resp = Response(text=self.content,headers=self.headers)
self.set_response_headers(resp)
return resp
async def datahandle(self,request):
debug('*******Error*************')
self.content=''
def setheaders(self):
pass
# self.headers['Content-Length'] = str(len(self.content))
class TemplateProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='tmpl'
async def path_call(self, request, params={}):
await self.set_run_env(request, params=params)
path = request.path
ns = self.run_ns
te = self.run_ns['tmpl_engine']
return await te.render(path,**ns)
async def datahandle(self,request):
self.content = await self.path_call(request)
def setheaders(self):
super(TemplateProcessor,self).setheaders()
if self.path.endswith('.tmpl.css'):
self.headers['Content-Type'] = 'text/css; utf-8'
elif self.path.endswith('.tmpl.js'):
self.headers['Content-Type'] = 'application/javascript ; utf-8'
else:
self.headers['Content-Type'] = 'text/html; utf-8'
class BricksUIProcessor(TemplateProcessor):
@classmethod
def isMe(self,name):
# print(f'{name=} is a bui')
return name=='bui'
async def datahandle(self, request):
params = await self.resource.y_env['request2ns']()
await super().datahandle(request)
if params.get('_webbricks_',None):
return
txt = self.content
entire_url = self.run_ns.get('entire_url')
content0 = await self.resource.path_call(request,entire_url('/bricks/header.tmpl'))
content2 = await self.resource.path_call(request,entire_url('/bricks/footer.tmpl'))
self.content = '%s%s%s' % (content0, txt, content2)
class PythonScriptProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='dspy'
async def loadScript(self, path):
data = ''
async with aiofiles.open(path,'r', encoding='utf-8') as f:
data = await f.read()
b= ''.join(data.split('\r'))
lines = b.split('\n')
lines = ['\t' + l for l in lines ]
txt = "async def myfunc(request,**ns):\n" + '\n'.join(lines)
return txt
async def path_call(self, request,params={}):
await self.set_run_env(request, params=params)
lenv = self.run_ns
del lenv['request']
txt = await self.loadScript(self.real_path)
# print(self.real_path, "#########", txt)
exec(txt,lenv,lenv)
func = lenv['myfunc']
return await func(request,**lenv)
async def datahandle(self,request):
self.content = await self.path_call(request)
class MarkdownProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='md'
async def datahandle(self,request:Request):
data = ''
async with aiofiles.open(self.real_path,'r',encoding='utf-8') as f:
data = await f.read()
self.content = self.urlreplace(data, request)
def urlreplace(self,mdtxt,request):
p = r'\[(.*)\]\((.*)\)'
return re.sub(p,
lambda x:'['+x.group(1)+'](' + self.resource.entireUrl(request, x.group(2)) + ')',
mdtxt)
def getProcessor(name):
# print(f'getProcessor({name})')
return _getProcessor(BaseProcessor, name)
def _getProcessor(kclass,name):
for k in kclass.__subclasses__():
if not hasattr(k,'isMe'):
continue
if k.isMe(name):
return k
a = _getProcessor(k,name)
if a is not None:
return a
return None
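
PythonScriptProcessor.loadScript() indents the whole .dspy file one tab and wraps it in "async def myfunc(request, **ns):", so a script is written as the body of that coroutine and simply returns its result; names such as params_kw and the logging helpers arrive through the server environment. A small illustrative hello.dspy:
```
# hello.dspy -- executed as the body of "async def myfunc(request, **ns)"
name = params_kw.get('name', 'world')
debug(f'hello.dspy called with name={name}')
# returning a dict makes BaseProcessor.handle() serialize it as JSON
return {'greeting': f'hello {name}'}
```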


@ -0,0 +1,91 @@
import os,sys
from sys import platform
import time
import ssl
from socket import *
from aiohttp import web
from appPublic.folderUtils import ProgramPath
from appPublic.dictObject import DictObject
from appPublic.jsonConfig import getConfig
from appPublic.log import info, debug, warning, error, critical, exception
from appPublic.registerfunction import RegisterCoroutine
from sqlor.dbpools import DBPools
from .processorResource import ProcessorResource
from .auth_api import AuthAPI
from .myTE import setupTemplateEngine
from .globalEnv import initEnv
from .filestorage import TmpFileRecord
from .loadplugins import load_plugins
class AHApp(web.Application):
def __init__(self, *args, **kw):
kw['client_max_size'] = 1024000000
super().__init__(*args, **kw)
self.data = DictObject()
def set_data(self, k, v):
self.data[k] = v
def get_data(self, k):
return self.data.get(k, DictObject())
class ConfiguredServer:
def __init__(self, auth_klass=AuthAPI, workdir=None):
self.auth_klass = auth_klass
self.workdir = workdir
if self.workdir is not None:
pp = ProgramPath()
config = getConfig(self.workdir,
{'workdir':self.workdir,'ProgramPath':pp})
else:
config = getConfig()
if config.databases:
DBPools(config.databases)
self.config = config
initEnv()
setupTemplateEngine()
client_max_size = 1024 * 10240
if config.website.client_max_size:
client_max_size = config.website.client_max_size
self.app = AHApp(client_max_size=client_max_size)
load_plugins(self.workdir)
async def build_app(self):
rf = RegisterCoroutine()
await rf.exe('ahapp_built', self.app)
auth = self.auth_klass()
await auth.setupAuth(self.app)
return self.app
def run(self, port=None):
config = getConfig()
self.configPath(config)
a = TmpFileRecord()
ssl_context = None
if port is None:
port = config.website.port or 8080
if config.website.ssl:
ssl_context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ssl_context.load_cert_chain(config.website.ssl.crtfile,
config.website.ssl.keyfile)
reuse_port = None
if platform != 'win32':
reuse_port = True
print('reuse_port=', reuse_port)
web.run_app(self.build_app(),host=config.website.host or '0.0.0.0',
port=port,
reuse_port=reuse_port,
ssl_context=ssl_context)
def configPath(self,config):
for p,prefix in config.website.paths:
res = ProcessorResource(prefix,p,show_index=True,
follow_symlinks=True,
indexes=config.website.indexes,
processors=config.website.processors)
self.app.router.register_resource(res)


@ -0,0 +1,57 @@
import os
import re
import traceback
from aiohttp.web_response import Response
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
)
from aiohttp import web
from aiohttp.web_request import Request
from aiohttp.web_routedef import AbstractRouteDef
from aiohttp.web import json_response
from sqlor.crud import CRUD
from appPublic.dictObject import multiDict2Dict
from appPublic.jsonConfig import getConfig
from appPublic.log import info, debug, warning, error, critical, exception
from .error import Error,Success
actions = [
"browse",
"add",
"update",
"filter"
]
class DBAdmin:
def __init__(self, request,dbname,tablename, action):
self.dbname = dbname
self.tablename = tablename
self.request = request
self.action = action
if action not in actions:
debug('action not defined:%s' % action)
raise HTTPNotFound
try:
self.crud = CRUD(dbname,tablename)
except Exception as e:
exception('e= %s' % e)
traceback.print_exc()
raise HTTPNotFound
async def render(self) -> Response:
try:
d = await self.crud.I()
return json_response(Success(d))
except Exception as e:
exception('except=%s' % e)
traceback.print_exc()
return json_response(Error(errno='metaerror',msg='get metadata error'))


@ -0,0 +1,67 @@
import codecs
import json
import aiofiles
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from .baseProcessor import BaseProcessor
from .serverenv import ServerEnv
class DataSourceProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='ds'
def __init__(self,filename,k):
super(DataSourceProcessor,self).__init__(filename,k)
self.actions = {
'getdata':self.getData,
'pagingdata':self.getPagingData,
'arguments':self.getArgumentsDesc,
'resultFields':self.getDataDesc,
'gridlist':self.getGridlist,
}
self.g = ServerEnv()
async def getData(self,dict_data,ns,request):pass
async def getPagingData(self,dict_data,ns,request):pass
async def getArgumentsDesc(self,dict_data,ns,request):pass
async def getDataDesc(self,dict_data,ns,request):pass
async def getGridlist(self,dict_data,ns,request):
ret = await self.getDataDesc(dict_data,ns,request)
ffs = [ f for f in ret if f.get('frozen',False) ]
fs = [ f for f in ret if not f.get('frozen',False) ]
[ f.update({'hide':True}) for f in ffs if f.get('listhide',False) ]
[ f.update({'hide':True}) for f in fs if f.get('listhide') ]
d = {
"iconCls":"icon-search",
"url":self.resource.absUrl(request,request.path + '?action=pagingdata'),
"view":"bufferview",
"options":{
"pageSize":50,
"pagination":False
}
}
d.update({'fields':fs})
if len(ffs)>0:
d.update({'ffields':ffs})
ret = {
"__ctmpl__":"datagrid",
"data":d
}
return ret
async def path_call(self, request, path):
dict_data = {}
config = getConfig()
async with aiofiles.open(path,'r',encoding=config.website.coding) as f:
b = await f.read()
dict_data = json.loads(b)
ns = self.run_ns
act = ns.get('action','getdata')
action = self.actions.get(act)
return await action(dict_data,ns,request)
async def datahandle(self,request):
self.content = await self.path_call(request, self.path)


@ -0,0 +1,27 @@
def Error(errno='undefined error',msg='Error'):
return {
"status":"Error",
"data":{
"message":msg,
"errno":errno
}
}
def Success(data):
return {
"status":"OK",
"data":data
}
def NeedLogin(path):
return {
"status":"need_login",
"data":path
}
def NoPermission(path):
return {
"status":"no_permission",
"data":path
}
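
These helpers are the uniform JSON envelope used by DBCrud, DBAdmin and the scripts; a hedged sketch of how a handler wraps its result:
```
from aiohttp.web import json_response
from ahserver.error import Success, Error

async def list_users(request):
    try:
        users = [{'id': 1, 'name': 'joe'}]      # stand-in for a real query
        return json_response(Success(users))     # {"status": "OK", "data": [...]}
    except Exception:
        return json_response(Error(errno='search error', msg='search error'))
```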


@ -0,0 +1,54 @@
import os
import asyncio
import mimetypes
from aiohttp.web_exceptions import HTTPNotFound
from aiohttp.web import StreamResponse
from aiohttp import web
import aiofiles
from appPublic.rc4 import RC4
crypto_aim = 'God bless USA and others'
def path_encode(path):
rc4 = RC4()
return rc4.encode(path,crypto_aim)
def path_decode(dpath):
rc4 = RC4()
return rc4.decode(dpath,crypto_aim)
async def file_upload(request):
pass
async def file_download(request, filepath, content_type=None):
filename = os.path.basename(filepath)
r = web.FileResponse(filepath)
ct = content_type
if ct is None:
ct, encoding = mimetypes.guess_type(filepath)
if ct is not None:
r.content_type = ct
else:
r.content_type = 'application/octet-stream'
r.content_disposition = 'attachment; filename=%s' % filename
r.enable_compression()
return r
if os.path.exists(filepath):
length = os.path.getsize(filepath)
response = web.Response(
status=200,
headers = {
'Content-Disposition': 'attrachment;filename={}'.format(filename)
}
)
await response.prepare(request)
# fallback streaming path (only reached if the FileResponse return above is removed)
async with aiofiles.open(filepath, 'rb') as f:
while True:
chunk = await f.read(10240000)
if not chunk:
break
await response.write(chunk)
await response.write_eof()
return response
raise HTTPNotFound


@ -0,0 +1,147 @@
# fileUpload.py
import asyncio
import os
import time
import tempfile
import aiofiles
import json
import time
from appPublic.folderUtils import _mkdir
from appPublic.jsonConfig import getConfig
from appPublic.Singleton import SingletonDecorator
from appPublic.log import info, debug, warning, exception, critical
@SingletonDecorator
class TmpFileRecord:
def __init__(self, timeout=3600):
self.filetime = {}
self.change_flg = False
self.timeout = timeout
self.time_period = 10
self.filename = self.savefilename()
self.loop = asyncio.get_event_loop()
self.loop.call_later(0.01, lambda: asyncio.create_task(self.load()))
def newtmpfile(self, path:str):
self.filetime[path] = time.time()
self.change_flg = True
def savefilename(self):
config = getConfig()
root = config.filesroot or tempfile.gettempdir()
pid = os.getpid()
return root + f'/tmpfile_rec_{pid}.json'
async def save(self):
if not self.change_flg:
return
async with aiofiles.open(self.filename, 'bw') as f:
s = json.dumps(self.filetime)
b = s.encode('utf-8')
await f.write(b)
await f.flush()
self.change_flg = False
async def load(self):
fn = self.filename
if not os.path.isfile(fn):
return
async with aiofiles.open(fn, 'br') as f:
b = await f.read()
s = b.decode('utf-8')
self.filetime = json.loads(s)
await self.remove()
def file_useful(self, fpath):
try:
del self.filetime[fpath]
except Exception as e:
exception(f'Exception:{str(e)}')
pass
async def remove(self):
tim = time.time()
ft = {k:v for k,v in self.filetime.items()}
for k,v in ft.items():
if tim - v > self.timeout:
self.rmfile(k)
del self.filetime[k]
await self.save()
self.loop.call_later(self.time_period, lambda: asyncio.create_task(self.remove()))
def rmfile(self, name:str):
config = getConfig()
os.remove(config.filesroot + name)
class FileStorage:
def __init__(self):
config = getConfig()
self.root = os.path.abspath(config.filesroot or tempfile.gettempdir())
self.tfr = TmpFileRecord()
def realPath(self,path):
if path[0] == '/':
path = path[1:]
p = os.path.abspath(os.path.join(self.root,path))
return p
def _name2path(self,name, userid=None):
name = os.path.basename(name)
paths=[191,193,197,97]
v = int(time.time()*1000000)
# b = name.encode('utf8') if not isinstance(name,bytes) else name
# v = int.from_bytes(b,byteorder='big',signed=False)
root = self.root
if userid:
root += f'/{userid}'
path = os.path.abspath(os.path.join(root,
str(v % paths[0]),
str(v % paths[1]),
str(v % paths[2]),
str(v % paths[3]),
name))
return path
def remove(self, path):
try:
if path[0] == '/':
path = path[1:]
p = os.path.join(self.root, path)
os.remove(p)
except Exception as e:
exception(f'{path=}, {p=} remove error')
async def save(self,name,read_data, userid=None):
p = self._name2path(name, userid=userid)
fpath = p[len(self.root):]
info(f'{p=}, {fpath=},{self.root} ')
_mkdir(os.path.dirname(p))
if isinstance(read_data, str) or isinstance(read_data, bytes):
b = read_data
if isinstance(read_data, str):
b = read_data.encode('utf-8')
async with aiofiles.open(p, 'wb') as f:
await f.write(b)
await f.flush()
self.tfr.newtmpfile(fpath)
return fpath
async with aiofiles.open(p,'wb') as f:
siz = 0
while 1:
d = await read_data()
if not d:
break
siz += len(d);
await f.write(d)
await f.flush()
self.tfr.newtmpfile(fpath)
return fpath
def file_realpath(path):
fs = FileStorage()
return fs.realPath(path)
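
FileStorage scatters uploads into hashed sub-folders below config.filesroot and records them in TmpFileRecord until they are claimed. A usage sketch, assuming getConfig() is initialized and config.filesroot is set:
```
import asyncio
from ahserver.filestorage import FileStorage

async def main():
    fs = FileStorage()
    # save() accepts str/bytes directly, or an async chunk-reader callable for uploads
    relpath = await fs.save('note.txt', 'hello upload')
    print(relpath)               # path relative to config.filesroot
    print(fs.realPath(relpath))  # absolute path on disk

asyncio.run(main())
```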


@ -0,0 +1,14 @@
import os
def current_fileno():
fn = './t.txt'
f = open(fn, 'w')
ret = f.fileno()
f.close()
os.remove(fn)
return ret
if __name__ == '__main__':
for i in range(1000):
print(current_fileno())


@ -0,0 +1,49 @@
import inspect
from appPublic.dictObject import DictObject
from appPublic.registerfunction import RegisterFunction
from appPublic.log import info, debug, warning, error, exception, critical
from aiohttp import web
from aiohttp.web_response import Response, StreamResponse
from .baseProcessor import BaseProcessor
class FunctionProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return False
def __init__(self,path,resource, opts):
self.config_opts = opts
BaseProcessor.__init__(self,path,resource)
async def path_call(self, request, path):
path1 = request.path[len(self.config_opts['leading']):]
args = []
if len(path1) > 0:
if path1[0] == '/':
path1 = path1[1:]
args += path1.split('/')
rfname = self.config_opts['registerfunction']
ns = DictObject(**self.run_ns)
rf = RegisterFunction()
f = rf.get(rfname)
if f is None:
error(f'{rfname=} is not registered, {rf.registKW=}')
return None
self.run_ns['request'] = request
globals().update(self.run_ns)
if inspect.iscoroutinefunction(f):
return await f(request, self.run_ns, *args)
return f(request, self.run_ns, *args)
async def datahandle(self,request):
x = await self.path_call(request, self.path)
if isinstance(x,web.FileResponse):
self.retResponse = x
elif isinstance(x,Response):
self.retResponse = x
else:
self.content = x
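
FunctionProcessor is chosen in url2processor() when the request path starts with one of the config.website.startswiths entries; the entry names the leading prefix and the registerfunction to call, and the remaining path segments become positional arguments. A sketch with hypothetical names (the RegisterFunction registration call itself belongs to appPublic and is not shown):
```
# a handler FunctionProcessor can dispatch to, registered under the name 'ping'
async def ping(request, run_ns, *args):
    # args are the path segments after the configured leading prefix
    return {'pong': list(args)}

# matching fragment of conf/config.json (illustrative):
# "website": {
#     "startswiths": [ {"leading": "/api/ping", "registerfunction": "ping"} ]
# }
```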


@ -0,0 +1,208 @@
# -*- coding:utf8 -*-
import os
import builtins
import sys
import codecs
from urllib.parse import quote
import json
import asyncio
import random
import time
import datetime
from openpyxl import Workbook
from tempfile import mktemp
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from appPublic.Singleton import GlobalEnv
from appPublic.argsConvert import ArgsConvert
from appPublic.timeUtils import str2Date,str2Datetime,curDatetime,getCurrentTimeStamp,curDateString, curTimeString
from appPublic.dataencoder import quotedstr
from appPublic.folderUtils import folderInfo
from appPublic.uniqueID import setNode,getID
from appPublic.unicoding import unicoding,uDict,uObject
from appPublic.Singleton import SingletonDecorator
from appPublic.rc4 import password
from sqlor.dbpools import DBPools,runSQL,runSQLPaging
from sqlor.filter import DBFilter, default_filterjson
from sqlor.crud import CRUD
from .xlsxData import XLSXData
from .uriop import URIOp
from .error import Success, Error, NeedLogin, NoPermission
from .filetest import current_fileno
from .filestorage import FileStorage
from .serverenv import ServerEnv
def data2xlsx(rows,headers=None):
wb = Workbook()
ws = wb.active
i = 1
if headers is not None:
for j in range(len(headers)):
v = headers[j].title if headers[j].get('title',False) else headers[j].name
ws.cell(column=j+1,row=i,value=v)
i += 1
for r in rows:
for j in range(len(r)):
v = r[headers[j].name]
ws.cell(column=j+1,row=i,value=v)
i += 1
name = mktemp(suffix='.xlsx')
wb.save(filename = name)
wb.close()
return name
async def save_file(str_or_bytes, filename):
fs = FileStorage()
r = await fs.save(filename, str_or_bytes)
return r
class FileOutZone(Exception):
def __init__(self,fp,*args,**kwargs):
super(FileOutZone,self).__init__(*args,**kwargs)
self.openfilename = fp
def __str__(self):
return self.openfilename + ': not allowed to open'
def get_config_value(kstr):
keys = kstr.split('.')
config = getConfig()
if config is None:
raise Exception('getConfig() error')
for k in keys:
config = config.get(k)
if not config:
return None
return config
def get_definition(k):
k = f'definitions.{k}'
return get_config_value(k)
def openfile(url,m):
fp = abspath(url)
if fp is None:
print(f'openfile({url},{m}),url is not match a file')
		raise Exception('url can not match a file')
config = getConfig()
paths = [ os.path.abspath(p) for p in config.website.paths ]
fs = config.get('allow_folders',[])
fs = [ os.path.abspath(i) for i in fs + paths ]
r = False
for f in fs:
if fp.startswith(f):
r = True
break
if not r:
raise FileOutZone(fp)
return open(fp,m)
def isNone(a):
return a is None
def abspath(path):
config = getConfig()
paths = [ os.path.abspath(p) for p in config.website.paths ]
for root in paths:
p = root + path
if os.path.exists(root+path):
return p
return None
def appname():
config = getConfig()
try:
return config.license.app
except:
return "test app"
def configValue(ks):
config = getConfig()
try:
a = eval('config' + ks)
return a
except:
return None
def visualcoding():
	return configValue('.website.visualcoding')
def file_download(request,path,name,coding='utf8'):
f = openfile(path,'rb')
b = f.read()
f.close()
fname = quote(name).encode(coding)
hah = b"attachment; filename=" + fname
# print('file head=',hah.decode(coding))
request.setHeader(b'Content-Disposition',hah)
request.setHeader(b'Expires',0)
request.setHeader(b'Cache-Control',b'must-revalidate, post-check=0, pre-check=0')
request.setHeader(b'Content-Transfer-Encoding',b'binary')
request.setHeader(b'Pragma',b'public')
request.setHeader(b'Content-Length',len(b))
request.write(b)
request.finish()
def initEnv():
pool = DBPools()
g = ServerEnv()
set_builtins()
g.configValue = configValue
g.visualcoding = visualcoding
g.uriop = URIOp
g.isNone = isNone
g.json = json
g.ArgsConvert = ArgsConvert
g.time = time
g.curDateString = curDateString
g.curTimeString = curTimeString
g.datetime = datetime
g.random = random
g.str2date = str2Date
g.str2datetime = str2Datetime
g.curDatetime = curDatetime
g.uObject = uObject
g.uuid = getID
g.runSQL = runSQL
g.runSQLPaging = runSQLPaging
g.runSQLIterator = pool.runSQL
g.runSQLResultFields = pool.runSQLResultFields
g.getTables = pool.getTables
g.getTableFields = pool.getTableFields
g.getTablePrimaryKey = pool.getTablePrimaryKey
g.getTableForignKeys = pool.getTableForignKeys
g.folderInfo = folderInfo
g.abspath = abspath
g.data2xlsx = data2xlsx
g.xlsxdata = XLSXData
g.openfile = openfile
g.CRUD = CRUD
g.DBPools = DBPools
g.DBFilter = DBFilter
g.default_filterjson = default_filterjson
g.Error = Error
g.Success = Success
g.NeedLogin = NeedLogin
g.NoPermission = NoPermission
g.password_encode = password
g.current_fileno = current_fileno
g.get_config_value = get_config_value
g.get_definition = get_definition
g.DictObject = DictObject
g.async_sleep = asyncio.sleep
g.quotedstr = quotedstr
g.save_file = save_file
def set_builtins():
all_builtins = [ i for i in dir(builtins) if not i.startswith('_')]
g = ServerEnv()
gg = globals()
for l in all_builtins:
exec(f'g["{l}"] = {l}',{'g':g})

View File

@ -0,0 +1,81 @@
import json
import aiohttp
from aiohttp import web, BasicAuth
from aiohttp import client
from appPublic.dictObject import DictObject
from .llm_client import StreamLlmProxy, AsyncLlmProxy, SyncLlmProxy
from .baseProcessor import *
class LlmProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llm'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = StreamLlmProxy(self, d)
self.retResponse = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass
class LlmSProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llms'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = SyncLlmProxy(self, d)
self.content = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass
class LlmAProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llma'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = AsyncLlmProxy(self, d)
self.retResponse = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass

View File

@ -0,0 +1,288 @@
import re
import base64
import json
from traceback import print_exc
from aiohttp import web
from appPublic.dictObject import DictObject
from appPublic.httpclient import HttpClient, RESPONSE_TEXT, RESPONSE_JSON, RESPONSE_BIN,RESPONSE_FILE, RESPONSE_STREAM
from appPublic.argsConvert import ArgsConvert
def encode_imagefile(fn):
with open(fn, 'rb') as f:
return base64.b64encode(f.read()).decode('utf-8')
class StreamLlmProxy:
def __init__(self, processor, desc):
assert desc.name
self.name = desc.name
self.processor = processor
self.auth_api = desc.auth
self.desc = desc
self.api_name = desc.name
self.data = DictObject()
self.ac = ArgsConvert('${', '}')
def line_chunk_match(self, l):
if self.api.chunk_match:
match = re.search(self.api.chunk_match, l)
if match:
return match.group(1)
return l
async def write_chunk(self, ll):
def eq(a, b):
return a == b
def ne(a, b):
return a != b
opfuncs = {
'==':eq,
'!=':ne
}
if '[DONE]' in ll:
return
try:
# print('write_chunk(),l=', ll)
l = self.line_chunk_match(ll)
d = DictObject(** json.loads(l))
j = {}
for r in self.api.resp or []:
				j[r.name] = d.get_data_by_keys(r.value)
if self.api.chunk_filter:
v = d.get_data_by_keys(self.api.chunk_filter.name)
v1 = self.api.chunk_filter.value
op = self.api.chunk_filter.op
f = opfuncs.get(op)
if f and f(v,v1):
j[self.api.chunk_filter.field] = ''
print('filtered j=', j)
jstr = json.dumps(j) + '\n'
bin = jstr.encode('utf-8')
await self.resp.write(bin)
await self.resp.drain()
except Exception as e:
print(f'Error:Write_chunk(),{l=} error:{e=}')
print_exc()
async def stream_handle(self, chunk):
print('chunk=', chunk)
chunk = chunk.decode('utf-8')
chunk = self.remain_str + chunk
lines = chunk.split('\n')
self.remain_str = lines[-1]
ls = lines[:-1]
for l in ls:
if l == '':
continue
await self.write_chunk(l)
async def get_apikey(self, apiname):
f = self.processor.run_ns.get_llm_user_apikey
if f:
# return a DictObject instance
return await f(apiname, self.user)
raise Exception('get_llm_user_apikey() function not found in ServerEnv')
async def do_auth(self, request):
d = self.desc.auth
self.data = self.get_data(self.name)
if self.data.authed:
return
self.data = await self.get_apikey(self.name)
if self.data is None:
raise Exception(f'user({self.user}) do not has a apikey for {self.name}')
method = d.get('method', 'POST')
headers = {}
for h in d.get('headers',{}):
headers[h.get('name')] = h.get('value')
mydata = {}
for p in d.data or []:
mydata[p.get('name')] = p.get('value')
myparams = {}
for p in d.params or []:
myparams[p.get('name')] = p.get('value')
url = d.get('url')
params = {}
_params = self.datalize(myparams, params)
_headers = self.datalize(headers, params)
_data = self.datalize(mydata, params)
hc = HttpClient()
resp_data = await hc.request(url, method, response_type=RESPONSE_JSON,
params=_params,
data=None if _data == {} else json.dumps(_data),
headers=_headers)
resp_data = DictObject(**resp_data)
for sd in d.set_data:
self.data[sd.name] = resp_data.get_data_by_keys(sd.field)
self.data.authed = True
self.set_data(self.name, self.data)
def data_key(self, apiname):
if self.user is None:
self.user = 'anonymous'
return apiname + '_a_' + self.user
def set_data(self, apiname, data):
request = self.processor.run_ns.request
app = request.app
app.set_data(self.data_key(apiname), data)
def get_data(self, apiname):
request = self.processor.run_ns.request
app = request.app
return app.get_data(self.data_key(apiname))
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
stream = params.stream
self.resp = web.StreamResponse()
await self.resp.prepare(request)
if stream is None:
stream = True
self.remain_str = ''
if not self.desc[mapi]:
raise Exception(f'{mapi} not defined')
d = self.desc[mapi]
self.api = d
self.chunk_match = d.chunk_match
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
method = d.get('method', 'POST')
headers = {}
for h in d.get('headers',{}):
headers[h.get('name')] = h.get('value')
mydata = {}
for p in d.get('data', {}):
mydata[p.get('name')] = p.get('value')
myparams = {}
for p in d.get('params', {}):
myparams[p.get('name')] = p.get('value')
url = d.get('url')
_params = self.datalize(myparams, params)
_headers = self.datalize(headers, params)
_data = self.datalize(mydata, params)
response_type = RESPONSE_STREAM
hc = HttpClient()
print(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data),
stream_func=self.stream_handle,
headers=_headers)
if self.remain_str != '':
await self.write_chunk(self.remain_str)
return self.resp
def datalize(self, dic, data={}):
mydata = self.data.copy()
mydata.update(data)
s1 = self.ac.convert(dic, mydata)
return s1
class SyncLlmProxy(StreamLlmProxy):
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
if not self.desc[mapi]:
return {
"status":"Error",
"message":f'{mapi} not defined'
}
d = self.desc[mapi]
self.api = d
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
method = d.get('method', 'POST')
headers = {}
for h in d.get('headers',{}):
headers[h.get('name')] = h.get('value')
mydata = {}
for p in d.get('data', {}):
mydata[p.get('name')] = p.get('value')
myparams = {}
for p in d.get('params', {}):
myparams[p.get('name')] = p.get('value')
url = d.get('url')
_params = self.datalize(myparams, params)
_headers = self.datalize(headers, params)
_data = self.datalize(mydata, params)
response_type = RESPONSE_JSON
hc = HttpClient()
print(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data),
headers=_headers)
print(f'{resp_data=}')
resp_data = DictObject(resp_data)
if resp_data is None:
return {
"status":"Error",
"message":f'{mapi} not defined'
}
return self.convert_resp(resp_data)
def convert_resp(self, resp):
j = {}
for r in self.api.resp or []:
			j[r.name] = resp.get_data_by_keys(r.value)
return j
class AsyncLlmProxy(StreamLlmProxy):
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
stream = params.stream
self.resp = web.StreamResponse()
await self.resp.prepare(request)
if stream is None:
stream = True
self.remain_str = ''
if not self.desc[mapi]:
raise Exception(f'{mapi} not defined')
d = self.desc[mapi]
self.api = d
self.chunk_match = d.chunk_match
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
method = d.get('method', 'POST')
headers = {}
for h in d.get('headers',{}):
headers[h.get('name')] = h.get('value')
mydata = {}
for p in d.get('data', {}):
mydata[p.get('name')] = p.get('value')
myparams = {}
for p in d.get('params', {}):
myparams[p.get('name')] = p.get('value')
url = d.get('url')
_params = self.datalize(myparams, params)
_headers = self.datalize(headers, params)
_data = self.datalize(mydata, params)
response_type = RESPONSE_JSON
hc = HttpClient()
print(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data),
headers=_headers)
if self.remain_str != '':
await self.write_chunk(self.remain_str)
return self.resp

View File

@ -0,0 +1,29 @@
import os
import sys
from appPublic.folderUtils import listFile
from appPublic.ExecFile import ExecFile
from ahserver.serverenv import ServerEnv
import appPublic
import sqlor
import ahserver
def load_plugins(p_dir):
ef = ExecFile()
pdir = os.path.join(p_dir, 'plugins')
if not os.path.isdir(pdir):
# print('load_plugins:%s not exists' % pdir)
return
sys.path.append(pdir)
ef.set('sys',sys)
ef.set('ServerEnv', ServerEnv)
for m in listFile(pdir, suffixs='.py'):
if m == '__init__.py':
continue
if not m.endswith('.py'):
continue
# print(f'{m=}')
module = os.path.basename(m[:-3])
# print('module=', module)
__import__(module, locals(), globals())

View File

@ -0,0 +1,59 @@
import os
import codecs
from appPublic.Singleton import SingletonDecorator
from appPublic.jsonConfig import getConfig
from jinja2 import Template, Environment, BaseLoader, TemplateNotFound
from .serverenv import ServerEnv
from .url2file import Url2File, TmplUrl2File
class TmplLoader(BaseLoader, TmplUrl2File):
def __init__(self, paths, indexes, subffixes=['.tmpl'], inherit=False):
BaseLoader.__init__(self)
TmplUrl2File.__init__(self,paths,indexes=indexes,subffixes=subffixes, inherit=inherit)
def get_source(self,env: Environment,template: str):
config = getConfig()
coding = config.website.coding
fp = self.url2file(template)
# print(f'{template=} can not transfer to filename')
if not os.path.isfile(fp):
raise TemplateNotFound(template)
mtime = os.path.getmtime(fp)
with codecs.open(fp,'r',coding) as f:
source = f.read()
return source,fp,lambda:mtime == os.path.getmtime(fp)
def join_path(self,name, parent):
return self.relatedurl(parent,name)
def list_templates(self):
return []
class TemplateEngine(Environment):
def __init__(self,loader=None):
Environment.__init__(self,loader=loader, enable_async=True)
self.urlpaths = {}
self.loader = loader
def join_path(self,template: str, parent: str):
return self.loader.join_path(template, parent)
async def render(self,___name: str, **globals):
t = self.get_template(___name,globals=globals)
return await t.render_async(globals)
def setupTemplateEngine():
config = getConfig()
subffixes = [ i[0] for i in config.website.processors if i[1] == 'tmpl' ]
loader = TmplLoader(config.website.paths,
config.website.indexes,
subffixes,
inherit=True)
engine = TemplateEngine(loader)
g = ServerEnv()
g.tmpl_engine = engine

View File

@ -0,0 +1,39 @@
from aiohttp import web
from appPublic.jsonConfig import getConfig
from p2psc.pubkey_handler import PubkeyHandler
from p2psc.p2psc import P2psc
class P2pLayer:
def __init__(self):
self.p2pcrypt = False
config = getConfig()
if config.website.p2pcrypt:
self.p2pcrypt = True
if not self.p2pcrypt:
return
self.handler = PubkeyHandler()
self.p2p = P2psc(self.handler, self.handler.get_myid())
	@web.middleware
	async def p2p_middle(self, request, handler):
		if not self.p2pcrypt:
			return await handler(request)
		if request.headers.get('P2pHandShake', None):
			return await self.p2p_handshake(request)
		if request.headers.get('P2pdata', None):
			request = await self.p2p_decode_request(request)
			resp = await handler(request)
			return await self.p2p_encode_response(resp)
		return await handler(request)
async def p2p_handshake(self, request):
pass
async def p2p_decode_request(self, request):
pass
async def p2p_encode_response(self, response):
return response

View File

@ -0,0 +1,446 @@
import os
import re
import codecs
import aiofiles
from traceback import print_exc
# from showcallstack import showcallstack
import asyncio
import json
from yarl import URL
from aiohttp import client
from aiohttp_auth import auth
from appPublic.http_client import Http_Client
from functools import partial
from aiohttp.web_urldispatcher import StaticResource, PathLike
from aiohttp.web_urldispatcher import Optional, _ExpectHandler
from aiohttp.web_urldispatcher import Path
from aiohttp.web_response import Response, StreamResponse
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
HTTPFound,
)
from aiohttp.web_fileresponse import FileResponse
from aiohttp.web_request import Request
from aiohttp.web_routedef import AbstractRouteDef
from appPublic.jsonConfig import getConfig
from appPublic.i18n import getI18N
from appPublic.dictObject import DictObject, multiDict2Dict
from appPublic.timecost import TimeCost
from appPublic.timeUtils import timestampstr
from appPublic.log import info, debug, warning, error, critical, exception
from .baseProcessor import getProcessor, BricksUIProcessor, TemplateProcessor
from .baseProcessor import PythonScriptProcessor, MarkdownProcessor
from .xlsxdsProcessor import XLSXDataSourceProcessor
from .llmProcessor import LlmProcessor, LlmSProcessor, LlmAProcessor
from .websocketProcessor import WebsocketProcessor, XtermProcessor
from .sqldsProcessor import SQLDataSourceProcessor
from .functionProcessor import FunctionProcessor
from .proxyProcessor import ProxyProcessor
from .serverenv import ServerEnv
from .url2file import Url2File
from .filestorage import FileStorage, file_realpath
from .restful import DBCrud
from .dbadmin import DBAdmin
from .filedownload import file_download, path_decode
from .utils import unicode_escape
from .filetest import current_fileno
from .auth_api import user_login, user_logout, get_session_user
def getHeaderLang(request):
al = request.headers.get('Accept-Language')
if al is None:
return 'en'
return al.split(',')[0]
def i18nDICT(request):
c = getConfig()
i18n = getI18N()
lang = getHeaderLang(request)
l = c.langMapping.get(lang,lang)
return json.dumps(i18n.getLangDict(l)).encode(c.website.coding)
class ProcessorResource(StaticResource,Url2File):
def __init__(self, prefix: str, directory: PathLike,
*, name: Optional[str]=None,
expect_handler: Optional[_ExpectHandler]=None,
chunk_size: int=256 * 1024,
show_index: bool=False, follow_symlinks: bool=False,
append_version: bool=False,
indexes:list=[],
processors:dict={}) -> None:
StaticResource.__init__(self,prefix, directory,
name=name,
expect_handler=expect_handler,
chunk_size=chunk_size,
show_index=show_index,
follow_symlinks=follow_symlinks,
append_version=append_version)
Url2File.__init__(self,directory,prefix,indexes,inherit=True)
gr = self._routes.get('GET')
self._routes.update({'POST':gr})
self._routes.update({'PUT':gr})
self._routes.update({'OPTIONS':gr})
self._routes.update({'DELETE':gr})
self._routes.update({'TRACE':gr})
self.y_processors = processors
self.y_prefix = prefix
self.y_directory = directory
self.y_indexes = indexes
self.y_env = DictObject()
def setProcessors(self, processors):
self.y_processors = processors
def setIndexes(self, indexes):
self.y_indexes = indexes
def abspath(self, request, path:str):
url = self.entireUrl(request, path)
path = self.url2path(url)
fname = self.url2file(path)
return fname
async def getPostData(self,request: Request) -> DictObject:
qd = {}
if request.query:
qd = multiDict2Dict(request.query)
reader = None
try:
reader = await request.multipart()
except:
# print('reader is None')
pass
if reader is None:
pd = await request.post()
pd = multiDict2Dict(pd)
if pd == {}:
if request.can_read_body:
x = await request.read()
try:
pd = json.loads(x)
except:
# print('body is not a json')
pass
qd.update(pd)
return DictObject(**qd)
ns = qd
while 1:
try:
field = await reader.next()
if not field:
break
value = ''
if hasattr(field,'filename') and field.filename is not None:
saver = FileStorage()
userid = await get_session_user(request)
value = await saver.save(field.filename,field.read_chunk, userid=userid)
else:
value = await field.read(decode=True)
value = value.decode('utf-8')
ov = ns.get(field.name)
if ov:
if type(ov) == type([]):
ov.append(value)
else:
ov = [ov,value]
else:
ov = value
ns.update({field.name:ov})
# print(f'getPostData():{ns=}')
except Exception as e:
print(e)
print_exc()
print('-----------except out ------------')
				break
return DictObject(ns)
def parse_request(self, request):
"""
get real schema, host, port, prepath
and save it to self._{attr}
"""
self._scheme = request.scheme
self._scheme = request.headers.get('X-Forwarded-Scheme',request.scheme)
k = request.host.split(':')
host = k[0]
port = 80
if len(k) == 2:
port = int(k[1])
elif self._scheme.lower() == 'https':
port = 443
self._host = request.headers.get('X-Forwarded-Host', host)
self._port = request.headers.get('X-Forwarded-Port', port)
self._prepath = request.headers.get('X-Forwarded-Prepath', '')
if self._prepath != '':
self._prepath = '/' + self._prepath
self._preurl = f'{self._scheme}://{self._host}:{self._port}{self._prepath}'
# print(f'{request.path=}, {self._preurl=}')
async def _handle(self,request:Request) -> StreamResponse:
clientkeys = {
"iPhone":"iphone",
"iPad":"ipad",
"Android":"androidpad",
"Windows Phone":"winphone",
"Windows NT[.]*Win64; x64":"pc",
}
def i18nDICT():
c = getConfig()
g = ServerEnv()
if not g.get('myi18n',False):
g.myi18n = getI18N()
lang = getHeaderLang(request)
l = c.langMapping.get(lang,lang)
return json.dumps(g.myi18n.getLangDict(l))
def getClientType(request):
agent = request.headers.get('user-agent')
if type(agent)!=type('') and type(agent)!=type(b''):
return 'pc'
for k in clientkeys.keys():
m = re.findall(k,agent)
if len(m)>0:
return clientkeys[k]
return 'pc'
def serveri18n(s):
lang = getHeaderLang(request)
c = getConfig()
g = ServerEnv()
if not g.get('myi18n',False):
g.myi18n = getI18N()
l = c.langMapping.get(lang,lang)
return g.myi18n(s,l)
async def getArgs() -> DictObject:
if request.method == 'POST':
return await self.getPostData(request)
ns = multiDict2Dict(request.query)
return DictObject(**ns)
async def redirect(url):
url = self.entireUrl(request, url)
raise HTTPFound(url)
async def remember_user(userid):
await user_login(request, userid)
async def remember_ticket(ticket):
await auth.remember_ticket(request, ticket)
async def get_ticket():
return await auth.get_ticket(request)
async def forget_user():
await user_logout(request)
async def get_user():
return await get_session_user(request)
self.parse_request(request)
self.y_env.i18n = serveri18n
self.y_env.file_realpath = file_realpath
self.y_env.redirect = redirect
self.y_env.info = info
self.y_env.error = error
self.y_env.debug = debug
self.y_env.warning = warning
self.y_env.critical = critical
self.y_env.exception = exception
self.y_env.remember_user = remember_user
self.y_env.forget_user = forget_user
self.y_env.get_user = get_user
self.y_env.i18nDict = i18nDICT
self.y_env.terminalType = getClientType(request)
self.y_env.entire_url = partial(self.entireUrl,request)
self.y_env.websocket_url = partial(self.websocketUrl,request)
self.y_env.abspath = self.abspath
self.y_env.request2ns = getArgs
self.y_env.aiohttp_client = client
self.y_env.resource = self
self.y_env.gethost = partial(self.gethost, request)
self.y_env.path_call = partial(self.path_call,request)
self.user = await auth.get_auth(request)
self.y_env.user = self.user
self.request_filename = self.url2file(str(request.path))
request['request_filename'] = self.request_filename
path = request.path
config = getConfig()
request['port'] = config.website.port
if config.website.dbadm and path.startswith(config.website.dbadm):
pp = path.split('/')[2:]
if len(pp)<3:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dbname = pp[0]
tablename = pp[1]
action = pp[2]
adm = DBAdmin(request,dbname,tablename,action)
return await adm.render()
if config.website.dbrest and path.startswith(config.website.dbrest):
pp = path.split('/')[2:]
if len(pp)<2:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dbname = pp[0]
tablename = pp[1]
id = None
if len(pp) > 2:
id = pp[2]
crud = DBCrud(request,dbname,tablename,id=id)
return await crud.dispatch()
if config.website.download and path.startswith(config.website.download):
pp = path.split('/')[2:]
if len(pp)<1:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dp = '/'.join(pp)
path = path_decode(dp)
return await file_download(request, path)
processor = self.url2processor(request, str(request.url), self.request_filename)
if processor:
ret = await processor.handle(request)
return ret
if self.request_filename and await self.isHtml(self.request_filename):
return await self.html_handle(request, self.request_filename)
if self.request_filename and os.path.isdir(self.request_filename):
config = getConfig()
if not config.website.allowListFolder:
error('%s:not found' % str(request.url))
raise HTTPNotFound
# print(f'{self.request_filename=}, {str(request.url)=} handle as a normal file')
return await super()._handle(request)
def gethost(self, request):
host = request.headers.get('X-Forwarded-Host')
if host:
return host
host = request.headers.get('Host')
if host:
return host
return '/'.join(str(request.url).split('/')[:3])
async def html_handle(self,request,filepath):
async with aiofiles.open(filepath,'r', encoding='utf-8') as f:
txt = await f.read()
utxt = txt.encode('utf-8')
headers = {
			'Content-Type': 'text/html; charset=utf-8',
'Accept-Ranges': 'bytes',
'Content-Length': str(len(utxt))
}
resp = Response(text=txt,headers=headers)
return resp
async def isHtml(self,fn):
try:
async with aiofiles.open(fn,'r',encoding='utf-8') as f:
b = await f.read()
while b[0] in ['\n',' ','\t']:
b = b[1:]
if b.lower().startswith('<html>'):
return True
if b.lower().startswith('<!doctype html>'):
return True
except Exception as e:
return False
def url2processor(self, request, url, fpath):
config = getConfig()
url1 = url
url = self.entireUrl(request, url)
host = '/'.join(url.split('/')[:3])
path = '/' + '/'.join(url.split('/')[3:])
if config.website.startswiths:
for a in config.website.startswiths:
if path.startswith(a.leading):
processor = FunctionProcessor(path,self,a)
return processor
if fpath is None:
print(f'fpath is None ..., {url=}, {url1=}')
return None
for word, handlername in self.y_processors:
if fpath.endswith(word):
Klass = getProcessor(handlername)
try:
processor = Klass(path,self)
# print(f'{f_cnt1=}, {f_cnt2=}, {f_cnt3=}, {f_cnt4=}, {f_cnt5=}')
return processor
except Exception as e:
print('Exception:',e, 'handlername=', handlername)
return None
return None
def websocketUrl(self, request, url):
		url = self.entireUrl(request, url)
if url.startswith('https'):
return 'wss' + url[5:]
return 'ws' + url[4:]
def urlWebsocketify(self, url):
if url.endswith('.ws') or url.endswith('.wss'):
if url.startswith('https'):
return 'wss' + url[5:]
return 'ws' + url[4:]
return url
def entireUrl(self, request, url):
ret_url = ''
if url.startswith('http://') or \
url.startswith('https://') or \
url.startswith('ws://') or \
url.startswith('wss://'):
ret_url = url
elif url.startswith('/'):
u = f'{self._preurl}{url}'
# print(f'entireUrl(), {u=}, {url=}, {self._preurl=}')
ret_url = u
else:
path = request.path
p = self.relatedurl(path,url)
u = f'{self._preurl}{p}'
ret_url = u
return self.urlWebsocketify(ret_url)
def url2path(self, url):
if url.startswith(self._preurl):
return url[len(self._preurl):]
return url
async def path_call(self, request, path, params={}):
url = self.entireUrl(request, path)
# print(f'{path=}, after entireUrl(), {url=}')
path = self.url2path(url)
fpath = self.url2file(path)
processor = self.url2processor(request, path, fpath)
# print(f'path_call(), {path=}, {url=}, {fpath=}, {processor=}, {self._prepath}')
new_request = request.clone(rel_url=path)
return await processor.be_call(new_request, params=params)

View File

@ -0,0 +1,53 @@
import json
import aiohttp
from appPublic.log import info, debug, warning, error, critical, exception
from aiohttp import web, BasicAuth
from aiohttp import client
from .baseProcessor import *
class ProxyProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='proxy'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
debug('proxyProcessor: data=%s' % data)
return data
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
reqH = request.headers.copy()
auth = None
if d.get('user') and d.get('password'):
auth = BasicAuth(d['user'], d['password'])
async with client.request(
request.method,
d['url'],
auth=auth,
headers = reqH,
allow_redirects=False,
data=await request.read()) as res:
headers = res.headers.copy()
# body = await res.read()
self.retResponse = web.StreamResponse(
headers = headers,
status = res.status
# ,body=body
)
await self.retResponse.prepare(request)
async for chunk in res.content.iter_chunked(chunk_size):
await self.retResponse.write(chunk)
debug('proxy: datahandle() finish')
def setheaders(self):
pass

View File

@ -0,0 +1,121 @@
import os
import re
import traceback
from aiohttp.web_response import Response
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
)
from aiohttp import web
from aiohttp.web_request import Request
from aiohttp.web import json_response
from sqlor.dbpools import DBPools
from appPublic.dictObject import multiDict2Dict
from appPublic.jsonConfig import getConfig
from .error import Error,Success
DEFAULT_METHODS = ('GET', 'POST', 'PUT', 'DELETE', 'HEAD', 'OPTIONS', 'TRACE')
class RestEndpoint:
def __init__(self):
self.methods = {}
for method_name in DEFAULT_METHODS:
method = getattr(self, method_name.lower(), None)
if method:
self.register_method(method_name, method)
def register_method(self, method_name, method):
self.methods[method_name.upper()] = method
async def dispatch(self):
		method = self.methods.get(self.request.method.upper())
if not method:
raise HTTPMethodNotAllowed('', DEFAULT_METHODS)
return await method()
class DBCrud(RestEndpoint):
def __init__(self, request,dbname,tablename, id=None):
super().__init__()
self.dbname = dbname
self.tablename = tablename
self.request = request
self.db = DBPools()
self.id = id
async def options(self) -> Response:
try:
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.I(self.tablename)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='metaerror',msg='get metadata error'))
async def get(self) -> Response:
"""
query data
"""
try:
ns = multiDict2Dict(self.request.query)
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.R(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='search error',msg='search error'))
async def post(self):
"""
insert data
"""
try:
ns = multiDict2Dict(await self.request.post())
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.C(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='add error',msg='add error'))
async def put(self):
"""
update data
"""
try:
ns = multiDict2Dict(await self.request.post())
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.U(self.tablename, ns)
return json_response(Success(' '))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='update error',msg='update error'))
	async def delete(self) -> Response:
"""
delete data
"""
try:
ns = multiDict2Dict(self.request.query)
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.D(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
			return json_response(Error(errno='delete error',msg='delete error'))

View File

@ -0,0 +1,25 @@
import re
from appPublic.Singleton import SingletonDecorator
from appPublic.dictObject import DictObject
@SingletonDecorator
class ServerEnv(DictObject):
pass
clientkeys = {
"iPhone":"iphone",
"iPad":"ipad",
"Android":"androidpad",
"Windows Phone":"winphone",
"Windows NT[.]*Win64; x64":"pc",
}
def getClientType(request):
	agent = request.headers.get('user-agent')
	if not agent:
		return 'pc'
for k in clientkeys.keys():
m = re.findall(k,agent)
if len(m)>0:
return clientkeys[k]
return 'pc'

View File

@ -0,0 +1,75 @@
import codecs
from .dsProcessor import DataSourceProcessor
from appPublic.jsonConfig import getConfig
from sqlor.dbpools import DBPools
import json
"""
sqlds file format:
{
"sqldesc":{
"sql_string":"select * from dbo.stock_daily_hist where stock_num=${stock_num}$ order by trade_date desc",
"db":"mydb",
"sortfield":"stock_date"
	},
"arguments":[
{
"name":"stock_num",
"type":"str",
"iotype":"text",
"default":"600804"
}
],
"datadesc":[
{
}
]
}
"""
class SQLDataSourceProcessor(DataSourceProcessor):
@classmethod
def isMe(self,name):
return name=='sqlds'
def getArgumentsDesc(self,dict_data,ns,request):
desc = dict_data.get('arguments',None)
return desc
async def getDataDesc(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQLResultFields
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
# print('sql(),sqldesc=',sqldesc)
return sqldesc
rec = dict_data.get('datadesc',None)
if rec is None:
sqldesc = dict_data.get('sqldesc')
ns = dict_data.get('arguments',{})
			rec = [ r for r in await sql(sqldesc['db'],ns) if r['name']!='_row_id' ]
dict_data['datadesc'] = rec
f = codecs.open(self.src_file,'w',self.config.website.coding)
b = json.dumps(dict_data,indent=4)
f.write(b)
f.close()
return rec
async def getData(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQL
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
return sqldesc
db = dict_data['sqldesc']['db']
ret = [ i for i in await sql(db,ns) ]
return ret
async def getPagingData(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQLPaging
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
return sqldesc
db = dict_data['sqldesc']['db']
ret = await sql(db,ns)
return ret

21
build/lib/ahserver/t.py Normal file
View File

@ -0,0 +1,21 @@
import os
def url2ospath(root, url: str) -> str:
url = url.split('?')[0]
if len(url) > 0 and url[-1] == '/':
url = url[:-1]
paths = url.split('/')
if url.startswith('http://') or url.startswith('https://'):
paths = paths[3:]
f = os.path.join(root,*paths)
real_path = os.path.abspath(f)
return real_path
def x(url):
root = '/G/ttt'
print(url2ospath(root, url))
x('/kboss/index.html')
x('kboss/index.html')
x('http://hhh.hhh/kboss/index.html')
x('https://hhh.hhh/kboss/index.html')

View File

@ -0,0 +1,83 @@
#
import os
import codecs
from appPublic.jsonConfig import getConfig
from appPublic.folderUtils import folderInfo
class URIopException(Exception):
def __init__(self,errtype,errmsg):
self.errtype = errtype
self.errmsg = errmsg
		super(URIopException,self).__init__('errtype=%s,errmsg=%s' % (errtype,errmsg))
def __str__(self):
return 'errtype=%s,errmsg=%s' % (self.errtype,self.errmsg)
class URIOp(object):
def __init__(self):
self.conf = getConfig()
self.realPath = os.path.abspath(self.conf.website.root)
def abspath(self,uri=None):
p = self.conf.website.root
if uri is not None and len(uri)>0:
x = uri
if x[0] == '/':
x = x[1:]
p = os.path.join(p,*x.split('/'))
d = os.path.abspath(p)
		if len(d) < len(self.realPath):
			raise URIopException('url scope error',uri)
		if d[:len(self.realPath)] != self.realPath:
			raise URIopException('url scope error',uri)
return d
def fileList(self,uri=''):
r = [ i for i in folderInfo(self.realPath,uri) ]
for i in r:
if i['type']=='dir':
i['state'] = 'closed'
i['id'] = '_#_'.join(i['id'].split('/'))
ret={
'total':len(r),
'rows':r
}
return ret
def mkdir(self,at_uri,name):
p = self.abspath(at_uri)
p = os.path.join(p,name)
os.mkdir(p)
	def rename(self,uri,newname):
		p = self.abspath(uri)
		d = os.path.dirname(p)
		np = os.path.join(d,newname)
		os.rename(p,np)
def delete(self,uri):
p = self.abspath(uri)
os.remove(p)
def save(self,uri,data):
p = self.abspath(uri)
f = codecs.open(p,"w",self.conf.website.coding)
f.write(data)
f.close()
def read(self,uri):
p = self.abspath(uri)
f = codecs.open(p,"r",self.conf.website.coding)
b = f.read()
f.close()
return b
def write(self,uri,data):
p = self.abspath(uri)
f = codecs.open(p,"w",self.conf.website.coding)
f.write(data)
f.close()

View File

@ -0,0 +1,114 @@
import os
from appPublic.folderUtils import listFile
class Url2File:
def __init__(self,path:str,prefix: str,
indexes: list, inherit: bool=False):
self.rootpath = path
self.starts = prefix
self.indexes = indexes
self.inherit = inherit
def realurl(self,url:str) -> str :
items = url.split('/')
items = [ i for i in items if i != '.' ]
while '..' in items:
for i,v in enumerate(items):
if v=='..' and i > 0:
del items[i]
del items[i-1]
break
return '/'.join(items)
def url2ospath(self, url: str) -> str:
url = url.split('?')[0]
if len(url) > 0 and url[-1] == '/':
url = url[:-1]
paths = url.split('/')
if url.startswith('http://') or \
url.startswith('https://') or \
url.startswith('ws://') or \
url.startswith('wss://'):
paths = paths[3:]
f = os.path.join(self.rootpath,*paths)
real_path = os.path.abspath(f)
# print(f'{real_path=}, {url=}, {f=}')
return real_path
def url2file(self, url: str) -> str:
ourl = url
url = url.split('?')[0]
real_path = self.url2ospath(url)
if os.path.isdir(real_path):
for idx in self.indexes:
p = os.path.join(real_path,idx)
if os.path.isfile(p):
# print(f'{url=}, {real_path=}, {idx=}, {p=}')
return p
if os.path.isfile(real_path):
return real_path
if not os.path.isdir(os.path.dirname(real_path)):
# print(f'url2file() return None, {real_path=}, {url=},{ourl=}, {self.rootpath=}')
return None
if not self.inherit:
# print(f'url2file() return None, self.inherit is false, {url:}, {self.rootpath=}')
return None
items = url.split('/')
if len(items) > 2:
del items[-2]
oldurl = url
url = '/'.join(items)
# print(f'{oldurl=}, {url=}')
return self.url2file(url)
# print(f'url2file() return None finally, {items:}, {url=}, {ourl=}, {self.rootpath=}')
return None
def relatedurl(self,url: str, name: str) -> str:
if len(url) > 0 and url[-1] == '/':
url = url[:-1]
fp = self.url2ospath(url)
if os.path.isfile(fp):
items = url.split('/')
del items[-1]
url = '/'.join(items)
url = url + '/' + name
return self.realurl(url)
def relatedurl2file(self,url: str, name: str):
url = self.relatedurl(url,name)
return self.url2file(url)
class TmplUrl2File:
def __init__(self,paths,indexes, subffixes=['.tmpl','.ui' ],inherit=False):
self.paths = paths
self.u2fs = [ Url2File(p,prefix,indexes,inherit=inherit) \
for p,prefix in paths ]
self.subffixes = subffixes
def url2file(self,url):
for u2f in self.u2fs:
fp = u2f.url2file(url)
if fp:
return fp
return None
def relatedurl(self,url: str, name: str) -> str:
for u2f in self.u2fs:
fp = u2f.relatedurl(url, name)
if fp:
return fp
return None
def list_tmpl(self):
ret = []
for rp,_ in self.paths:
p = os.path.abspath(rp)
[ ret.append(i) for i in listFile(p,suffixs=self.subffixes,rescursive=True) ]
return sorted(ret)

View File

@ -0,0 +1,4 @@
def unicode_escape(s):
x = [ch if ord(ch) < 256 else ch.encode('unicode_escape').decode('utf-8') for ch in s]
return ''.join(x)

View File

@ -0,0 +1 @@
__version__ = '0.3.4'

View File

@ -0,0 +1,183 @@
import asyncio
import aiohttp
import aiofiles
import json
import codecs
from aiohttp import web
import aiohttp_cors
from traceback import print_exc
from appPublic.sshx import SSHNode
from .baseProcessor import BaseProcessor, PythonScriptProcessor
class XtermProcessor(PythonScriptProcessor):
@classmethod
def isMe(self,name):
return name=='xterm'
async def ws_2_process(self, ws):
async for msg in ws:
if msg.type == aiohttp.WSMsgType.TEXT:
self.p_obj.stdin.write(msg.data)
elif msg.type == aiohttp.WSMsgType.ERROR:
# print('ws connection closed with exception %s' % ws.exception())
return
async def process_2_ws(self, ws):
while self.running:
x = await self.p_obj.stdout.read(1024)
await self.ws_sendstr(ws, x)
async def datahandle(self,request):
await self.path_call(request)
async def path_call(self, request, params={}):
#
# xterm file is a python script as dspy file
# it must return a DictObject with sshnode information
# parameters: nodeid
#
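		# Illustrative only: a hypothetical "shell.xterm" script could end with
		#     return DictObject(host='192.168.1.10', port=22,
		#                       username='root', password='secret')
		# create_process() below uses host/port/username/password from that
		# object to open the ssh session.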
		login_info = await super().path_call(request, params=params)
ws = web.WebSocketResponse()
await ws.prepare(request)
await self.create_process(login_info)
		await self.ws_sendstr(ws, 'Welcome to ssh client')
r1 = self.ws_2_process(ws)
r2 = self.process_2_ws(ws)
await asyncio.gather(r1,r2)
self.retResponse = ws
return ws
async def get_login_info(self):
async with aiofiles.open(self.real_path, 'r', encoding='utf-8') as f:
txt = await f.read()
self.login_info = json.loads(txt)
# print(f'{self.login_info=}')
	async def create_process(self, login_info):
# id = lenv['params_kw'].get('termid')
host = login_info['host']
port = login_info.get('port', 22)
username = login_info.get('username', 'root')
password = login_info.get('password',None)
self.sshnode = SSHNode(host, username=username,
password=password,
port=port)
await self.sshnode.connect()
self.p_obj = await self.sshnode._process('bash',
term_type='vt100',
term_size=(80, 24),
encoding='utf-8')
self.running = True
async def ws_sendstr(self, ws:web.WebSocketResponse, s:str):
data = {
"type":1,
"data":s
}
await ws.send_str(json.dumps(data))
def close_process(self):
self.sshnode.close()
self.p_obj.close()
async def ws_send(ws:web.WebSocketResponse, data):
if not isinstance(data, str):
data = json.dumps(data)
d = {
"type":1,
"data":data
}
d = json.dumps(d)
try:
return await ws.send_str(d)
except Exception as e:
print('ws.send_str() error:', e)
return False
class WsPool:
def __init__(self, ws, ws_path, app):
self.app = app
self.id = None
self.ws = ws
print(f'-----------{self.id=}-----------')
self.ws_path = ws_path
def get_data(self):
return self.app.get_data(self.ws_path)
def set_data(self, data):
self.app.set_data(self.ws_path, data)
def is_online(self, userid):
data = self.get_data()
ws = data.get(userid)
if ws is None:
return False
return True
def register(self, id):
self.id = id
self.add_mine()
def add_mine(self):
data = self.get_data()
if data is None:
data = {}
data.update({self.id:self.ws})
self.set_data(data)
def delete_mine(self):
data = self.get_data()
if not data.get(self.id):
return
		data = {k:v for k,v in data.items() if k != self.id }
self.set_data(data)
async def sendto(self, data, id=None):
if id is None:
return await ws_send(self.ws, data)
d = self.get_data()
ws = d.get(id)
return await ws_send(ws, data)
class WebsocketProcessor(PythonScriptProcessor):
@classmethod
def isMe(self,name):
return name=='ws'
async def path_call(self, request,params={}):
await self.set_run_env(request)
lenv = self.run_ns.copy()
lenv.update(params)
params_kw = lenv.params_kw
userid = lenv.params_kw.userid or await lenv.get_user()
del lenv['request']
txt = await self.loadScript(self.real_path)
ws = web.WebSocketResponse()
try:
await ws.prepare(request)
except Exception as e:
print('--------except:', e, type(e))
print_exc()
raise e
ws_pool = WsPool(ws, request.path, request.app)
async for msg in ws:
if msg.type == aiohttp.WSMsgType.TEXT:
if msg.data == 'exit':
break
lenv['ws_data'] = msg.data
lenv['ws_pool'] = ws_pool
exec(txt,lenv,lenv)
func = lenv['myfunc']
resp = await func(request,**lenv)
# await self.ws_sendstr(ws, resp)
elif msg.type == aiohttp.WSMsgType.ERROR:
# print('ws connection closed with exception %s' % ws.exception())
break
else:
print('datatype error', msg.type)
ws_pool.delete_mine()
self.retResponse = ws
await ws.close()
return ws

View File

@ -0,0 +1,128 @@
from openpyxl import load_workbook
import json
"""
xlsxds file format:
{
"xlsxfile":"./data.xlsx",
"data_from":7,
"data_sheet":"Sheet1",
"label_at",1,
"name_at":null,
"datatype_at":2,
"ioattrs_at":3,
"listhide_at":4,
"inputhide_at":5,
"frozen_at":6
}
"""
class XLSXData:
def __init__(self,path,desc):
self.desc = desc
self.xlsxfile = path
self.workbook = load_workbook(self.xlsxfile)
self.ws = self.workbook[self.desc['data_sheet']]
def getBaseFieldsInfo(self):
ws = self.workbook[self.desc['data_sheet']]
ret = []
for y in range(1,ws.max_column+1):
r = {
'name':self._fieldName(ws,y),
'label':self._fieldLabel(ws,y),
'type':self._fieldType(ws,y),
'listhide':self._isListHide(ws,y),
'inputhide':self._isInputHide(ws,y),
'frozen':self._isFrozen(ws,y)
}
r.update(self._fieldIOattrs(ws,y))
ret.append(r)
return ret
def _fieldName(self,ws,i):
x = self.desc.get('name_at')
if x is not None:
return ws.cell(x,i).value
return 'f' + str(i)
def _fieldLabel(self,ws,i):
x = self.desc.get('label_at',1)
if x is not None:
return ws.cell(x,i).value
return 'f' + str(i)
def _fieldType(self,ws,i):
x = self.desc.get('datatype_at')
if x is not None:
return ws.cell(x,i).value
return 'str'
def _fieldIOattrs(self,ws,i):
x = self.desc.get('ioattrs_at')
if x is not None:
t = ws.cell(x,i).value
if t is not None:
try:
					return json.loads(t)
except Exception as e:
print('xlsxData.py:field=',i,'t=',t,'error')
return {}
def _isFrozen(self,ws,i):
x = self.desc.get('frozen_at')
if x is not None:
			t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def _isListHide(self,ws,i):
x = self.desc.get('listhide_at')
if x is not None:
t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def _isInputHide(self,ws,i):
x = self.desc.get('inputhide_at')
if x is not None:
t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def getPeriodData(self,min_r,max_r):
ws = self.ws
rows = []
assert(min_r >= self.desc.get('data_from',2))
		if max_r > ws.max_row:
			max_r = ws.max_row + 1
		if min_r <= max_r:
			x = min_r
while x < max_r:
d = {}
for y in range(1,ws.max_column+1):
name = self._fieldName(ws,y)
d.update({name:ws.cell(column=y,row=x).value})
rows.append(d)
x = x + 1
return rows
def getArgumentsDesc(self,ns,request):
return None
def getData(self,ns):
ws = self.ws
min_r = self.desc.get('data_from',2)
return self.getPeriodData(min_r,ws.max_row + 1)
def getPagingData(self,ns):
rows = int(ns.get('rows',50))
page = int(ns.get('page',1))
d1 = self.desc.get('data_from',2)
min_r = (page - 1) * rows + d1
max_r = page * rows + d1 + 1
rows = self.getPeriodData(min_r,max_r)
ret = {
'total':self.ws.max_row - d1,
'rows':rows
}
return ret

View File

@ -0,0 +1,51 @@
import codecs
from openpyxl import load_workbook
from appPublic.jsonConfig import getConfig
from .dsProcessor import DataSourceProcessor
from .xlsxData import XLSXData
"""
xlsxds file format:
{
"xlsxfile":"./data.xlsx",
"data_from":7,
"data_sheet":"Sheet1",
"label_at",1,
"name_at":null,
"datatype_at":2,
"ioattrs":3,
"listhide_at":4,
"inputhide_at":5,
"frozen_at":6
}
"""
class XLSXDataSourceProcessor(DataSourceProcessor):
@classmethod
def isMe(self,name):
return name=='xlsxds'
def getArgumentsDesc(self,dict_data,ns,request):
return None
async def getDataDesc(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
		self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
		ret = self.xlsxdata.getBaseFieldsInfo()
return ret
async def getData(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
ret = self.xlsxdata.getData(ns)
return ret
async def getPagingData(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
		self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
ret = self.xlsxdata.getPagingData(ns)
return ret

5
change.log Executable file
View File

@ -0,0 +1,5 @@
# 2023-06-18
changed the permission check into a single function, checkUserPermission(), in auth_api.py
# 2023-06-22
fixed a bug in getPostData so that POST requests also pick up parameters passed in request.query

3
changelog.txt Executable file
View File

@ -0,0 +1,3 @@
2023-06-12
modified auth_api.py so that it only needs the needAuth() and checkUserPermission(user, path) methods

83
conf/config.json Executable file
View File

@ -0,0 +1,83 @@
{
"debug":true,
"databases":{
"aiocfae":{
"driver":"aiomysql",
"async_mode":true,
"coding":"utf8",
"dbname":"cfae",
"kwargs":{
"user":"test",
"db":"cfae",
"password":"test123",
"host":"localhost"
}
},
"cfae":{
"driver":"mysql.connector",
"coding":"utf8",
"dbname":"cfae",
"kwargs":{
"user":"test",
"db":"cfae",
"password":"test123",
"host":"localhost"
}
}
},
"filesroot":"$[workdir]$/files",
"website":{
"paths":[
["$[workdir]$/../usedpkgs/antd","/antd"],
["$[workdir]$/../wolon",""]
],
"host":"0.0.0.0",
"port":8080,
"coding":"utf-8",
"ssl_gg":{
"crtfile":"$[workdir]$/conf/www.bsppo.com.pem",
"keyfile":"$[workdir]$/conf/www.bsppo.com.key"
},
"indexes":[
"index.html",
"index.tmpl",
"index.dspy",
"index.md"
],
"dbrest":"/dbs",
"download":"/download",
"visualcoding":{
"default_root":"/samples/vc/test",
"userroot":{
"ymq":"/samples/vc/ymq",
"root":"/samples/vc/root"
},
"jrjpath":"/samples/vc/default"
},
"processors":[
[".xlsxds","xlsxds"],
[".sqlds","sqlds"],
[".tmpl.js","tmpl"],
[".tmpl.css","tmpl"],
[".html.tmpl","tmpl"],
[".tmpl","tmpl"],
[".dspy","dspy"],
[".md","md"]
],
"startswith":{
"/thumb":{
"registerfunction":"makeThumb",
"options":{
"width":256,
"keep_ratio":1
}
}
}
},
"langMapping":{
"zh-Hans-CN":"zh-cn",
"zh-CN":"zh-cn",
"en-us":"en",
"en-US":"en"
}
}

0
i18n/en/msg.txt Executable file
View File

0
i18n/zh-cn/msg.txt Executable file
View File

17
requirements.txt Executable file
View File

@ -0,0 +1,17 @@
asyncio
aiofiles
aiodns
cchardet
aiohttp
aiohttp_session
aiohttp_auth_autz
aiohttp-cors
aiomysql
aioredis
psycopg2-binary
aiopg
jinja2
ujson
openpyxl
pillow
py-natpmp

53
setup.py Executable file
View File

@ -0,0 +1,53 @@
# -*- coding: utf-8 -*-
from distutils.core import setup
try:
from setuptools import setup, find_packages
except:
from distutils.core import setup
from ahserver.version import __version__
# usage:
# python setup.py bdist_wininst : generates a Windows installer
# python setup.py bdist_egg : generates an egg file
# Release information about ahserver
version = __version__
name = "ahserver"
description = "ahserver"
author = "yumoqing"
email = "yumoqing@gmail.com"
required = []
with open('requirements.txt', 'r') as f:
ls = f.read()
required = ls.split('\n')
packages=find_packages()
package_data = {}
with open("README.md", "r") as fh:
long_description = fh.read()
setup(
name=name,
version=version,
# uncomment the following lines if you fill them out in release.py
description=description,
author=author,
author_email=email,
install_requires=required,
packages=packages,
package_data=package_data,
keywords = [
],
url="https://github.com/yumoqing/ahserver",
long_description=long_description,
long_description_content_type="text/markdown",
classifiers = [
'Operating System :: OS Independent',
'Programming Language :: Python :: 3',
'License :: OSI Approved :: MIT License',
],
)