Can't get working Examples for type: "string[]"
What happened?
- I set my configuration to:

```php
 * @bodyParam urls string[] Urls to check for. Example: ["https://example.com","https://example.com/test"]
```

- Then I ran `php artisan scribe:generate`
- But I saw

instead of...
Screenshots and stack traces:

My environment:
- PHP version (from `php -v`): PHP 7.4.21 (cli) (built: Sep 30 2021 13:37:02) ( NTS ), Zend Engine v3.4.0, with Xdebug v2.9.8
- Framework (Laravel/Lumen): Laravel 8.72.0
- Scribe version (from `composer show knuckleswtf/scribe`): 3.16.0
Well, that's a very minor bug; should be easy to fix. Just like this line, which sets the example as the input value for primitive types:

https://github.com/knuckleswtf/scribe/blob/6a001e403b65567be4089e26ab18bd4f23dd2216/resources/views/components/field-details.blade.php#L57

you could do something similar for lists here:

https://github.com/knuckleswtf/scribe/blob/6a001e403b65567be4089e26ab18bd4f23dd2216/resources/views/components/field-details.blade.php#L44-L52
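For illustration, a rough sketch of what that list branch could do. The variable names `$name`, `$type`, and `$example` are assumptions for the sake of the example, not necessarily what Scribe's template actually uses; treat this as pseudocode for the Blade change, not a drop-in patch:

```blade
{{-- Hypothetical sketch: pre-fill one text input per list item, --}}
{{-- mirroring what the primitive-type branch does with the example value. --}}
@if(is_array($example))
    @foreach($example as $index => $item)
        <input type="text"
               name="{{ $name }}[{{ $index }}]"
               value="{{ $item }}">
    @endforeach
@endif
```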
Hello, I have a similar problem. I need to receive an array of strings (base64-encoded images).

Payload example:

```json
{ ... "files": [ "base64Image"
```

Body param:

```php
 * @bodyParam files string[] attachments in base64. Example: ["iVBORw0KGgoAAAA..."]
```

And this validation rule on the request object (a file of about 50 MB grows to roughly 70 MB once base64-encoded):

```php
'files.*' => ['string', 'max:70000000'],
```
When I run:

```
php artisan scribe:generate
```

it fails with:

```
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 67853424 bytes) in /opt/atlassian/pipelines/agent/build/vendor/fakerphp/faker/src/Faker/Provider/Base.php on line 374

[2022-09-13 09:01:41] testing.ERROR: Allowed memory size of 134217728 bytes exhausted (tried to allocate 67853424 bytes) {"exception":"[object] (Symfony\\Component\\ErrorHandler\\Error\\FatalError(code: 0): Allowed memory size of 134217728 bytes exhausted (tried to allocate 67853424 bytes) at /opt/atlassian/pipelines/agent/build/vendor/fakerphp/faker/src/Faker/Provider/Base.php:374) [stacktrace]
```

Env: Scribe 3.31.0, Laravel 9.1.2

Could you suggest how to proceed?
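One stopgap worth trying, assuming the pipeline's 128 MB `memory_limit` is the immediate blocker: raise the limit for the generation command only, using PHP's standard `-d` ini override (the `512M` value here is just an example):

```sh
# -d overrides an ini setting for this single invocation,
# without editing php.ini or the pipeline image.
php -d memory_limit=512M artisan scribe:generate
```

This doesn't address the root cause (the huge generated example string), but it can unblock the pipeline run.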
@vovech it seems to work without any problems here. If you are using a form request, define the `bodyParameters` method and remove the docblock from your controller method. You are getting the out-of-memory error because Scribe is ignoring the "Example" in your docblock and generating one with Faker instead. The problem is... it's a string with 70000000 chars.

Use either a form request with the `bodyParameters` method, or a `@bodyParam` docblock without a form request, not both.
```php
class PostController extends Controller
{
    /**
     * @bodyParam files string[] attachments in base64. Example: ["iVBORw0KGgoAAAA..."]
     */
    public function create(Request $request)
    {
        \Validator::make($request->all(), ['files.*' => ['string', 'max:70000000']])->validate();
    }
}
```
or
```php
class CreatePostRequest extends FormRequest
{
    public function rules(): array
    {
        return [
            'files.*' => ['string', 'max:70000000'],
        ];
    }

    public function bodyParameters()
    {
        return [
            'files.*' => [
                'example' => 'iVBORw0KGgoAAAA...',
            ],
        ];
    }
}

class PostController extends Controller
{
    public function create(CreatePostRequest $request)
    {
        //..
    }
}
```
Sorry, I didn't clarify one point. Locally it works fine; the problem is memory usage. On a Bitbucket pipeline, for example, PHP memory is constrained to 128 MB, and with bigger files memory usage grows very quickly.
As a suggestion, though: don't use base64 for upload validation; use the `file` rule instead. You'll spend a lot of server memory holding large base64 strings in variables.
```php
public function create(CreatePostRequest $request)
{
    Validator::make($request->all(), [
        'files.*' => ['file', 'mimes:pdf,png,jpg'],
    ])->validate();
}
```
> Locally works fine. Problem is memory usage. But for example on bitbucket pipeline there is a php memory usage constraint to 128mb. If I use bigger file size, memory usage rises quickly.

It's because you are storing large strings in variables. You probably have 128 MB of max memory in your PHP settings, and you probably don't need more than that.
@vovech I suggest you send the files as form data instead of base64.
The main issue here is that I would expect this library to read an example payload instead, using the maximum allowed length for this field. Do you think that's possible?

Of course there is a workaround, as you mention: using form data, or chunked payloads if there is no other solution. But there is a consistency concern here: you expect every endpoint to behave the same way, in this case receiving JSON data.

Thanks in advance.
Yep, you can; do as I said above.

> The main issue here is that I would expect this library to read an Example payload instead
Normally it does, but since you're also using a FormRequest, it will read that too. Which is why @leandrodiogenes suggests removing the `@bodyParam` tag and replacing it with the method in your FormRequest.
Cool, thanks. Will try.